
The Information Factories Are Here

CowboyNeal posted more than 7 years ago | from the brave-new-world dept.


prostoalex writes, "Wired magazine has coined a new term for the massive data centers built in the Pacific Northwest by Google, Microsoft, and Yahoo! Cloudware is, ironically, a return to the centralized data and bandwidth powerhouses, brought about by the decentralized and distributed nature of the Internet. George Gilder thinks we're witnessing something monumental: 'According to Bell's law, every decade a new class of computer emerges from a hundredfold drop in the price of processing power. As we approach a billionth of a cent per byte of storage, and pennies per gigabit per second of bandwidth, what kind of machine labors to be born? How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?'"


126 comments

Supply of fiber too low for a revolution? (4, Insightful)

Salvance (1014001) | more than 7 years ago | (#16791228)

At some point massive data centers won't provide incremental benefits unless the massive increases in processing power are met with proportional decreases in bandwidth prices. Sure, bandwidth prices have dropped, but not at nearly the rate that price per teraflop of processing has. Companies like Google recognize this, and are investing in their own fiber to compensate [lightreading.com]. But the telecommunications companies are the ones that originally built these lines, and it's unfortunately in their best interest to keep the supply of spare bandwidth very low.

-1 OFFTOPIC (0, Funny)

Anonymous Coward | more than 7 years ago | (#16792788)

HOORAY FOR THE RETURN OF THREAD REPLIES!! Lameness filter encountered. Post aborted! Reason: Don't use so many caps. It's like YELLING.

Re:Supply of fiber too low for a revolution? (0)

Anonymous Coward | more than 7 years ago | (#16795540)

what?

Cloudware? (1)

ploss (860589) | more than 7 years ago | (#16791256)

Hopefully we've slowed its inevitable takeover of the race by giving it an extremely macho name. I for one welcome our fluffy cloudware overlords.

Synonym Myths. (2, Informative)

Anonymous Coward | more than 7 years ago | (#16791258)

" And how soon will it, in its inevitable turn, become a dinosaur?'"

I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth. They obviously didn't last millions of years because of any defects. But when a big-ass meteor comes crashing into the planet, any species, capable or not, would be hard-pressed to survive.

it's people (5, Funny)

User 956 (568564) | more than 7 years ago | (#16791272)

what kind of machine labors to be born? How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

How will we feed it? Read the article about the robot that identifies human flesh as bacon [wired.com] and see if that answers your question.

Re:it's people (3, Funny)

gramji (875033) | more than 7 years ago | (#16792618)

or for the movie freak - Humans as batteries (The Matrix).

Losing the difference between here and there (3, Interesting)

BadAnalogyGuy (945258) | more than 7 years ago | (#16791274)

When we look at our current situation, we see that we have data 'here' and data 'there'. When we want to have more data, we need to go 'there' to bring the data 'here' for viewing. In the most extreme (and common) case, the data is only temporarily copied from 'there' to 'here' and once we are done with the data it is deleted from 'here'.

The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.

It will take quite a bit more technology to make this a reality, but the Internet is the first baby step away from separation of data repository and the user. Now, users can access data 'there' on a browser which is 'here' with a few keystrokes. In the future, this action of 'getting' data will be eliminated completely.

How I think that will occur is neither here nor there, but I guarantee that this is what will happen.

I'm sorry (5, Insightful)

Colin Smith (2679) | more than 7 years ago | (#16792638)

The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.


But this is the biggest load of new age bullshit I've heard in years.

 

Re:I'm sorry (2, Interesting)

oc255 (218044) | more than 7 years ago | (#16793294)

But this is the biggest load of new age bullshit I've heard in years.

No need to get all worked up, you'll never see it.

I'd say with enough memory on the user's machine, there would be no concern about storing information twice. Just as a BS example, imagine they get something like atomic memory working where a sugar-cube sized device can cache all the information we have. Now imagine that we have perfected quantum teleportation (I know, I know). All data could be replicated and cached instantly and there would be no delay in keeping the massive distributed grid sync'd. In this case, I nod at the OP. The data is just here or even there. It's all just existing.

Of course, the flipside is that tech will just complicate itself into a giant mess. And even though we have instant ISPs, we can't find the Linux driver to make our quantum modem work. Ok, so then you find the driver, but the massive amount of traffic causes your 42.0 yottabyte /var partition to fill up with logs because you were running with DEBUG logging.

Re:Losing the difference between here and there (3, Insightful)

caluml (551744) | more than 7 years ago | (#16794022)

Data will simply exist and we will access it as if it were immediately 'here' all the time.

And precisely where will this data be stored, and how will it get to us? It's not some entity, omnipresent, floating around everywhere, that you can put your hand up, and pull out a load of data.
It has to be stored somewhere. And it has to get from where it's stored to where it's needed.

Re:Losing the difference between here and there (3, Insightful)

BadAnalogyGuy (945258) | more than 7 years ago | (#16794114)

Where is the data for Yahoo!s servers located? Where is the data for Microsoft's servers located?

Your GMail account's data? Do you know where that is?

No, of course you don't. Because you don't need to. You log in, access the data from the intarweb, fiddle with it, then log off. You aren't doing any of the copying, and the physical location of the data is totally irrelevant for all intents and purposes.

The intartubes are the first step towards removing the requirement of "transferring" data. While some data is pretty well virtualized, a lot of it like files over FTP and large downloads over HTTP are still cumbersome. Beyond that, there is still a differentiation between files on your computer and files on the Web. Why should you need physical access to your computer to access some files? Why do you need to connect via VPN to access those same files from a remote location?

When all things are connected seamlessly you won't ask where the data is stored because it won't matter. What matters is only your ability to access it when you need it.

Why Gilder Is Telecosmically Wrong (5, Insightful)

miller60 (554835) | more than 7 years ago | (#16791280)

Everything is getting cheaper but power, which for some data centers now costs more than hardware. Nicholas Carr [roughtype.com] explains why Gilder's assumptions are problematic:

"What Gilder calls 'petascale computing' is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the "awesome consumption" of electricity"

As I noted in our commentary at Data Center Knowledge [datacenterknowledge.com], the power issues with high-density blade server computing have been understood for years. Back in 2002, Liebert, APC, and other equipment vendors were developing products that could address huge heat loads. They saw it coming, and sensed a market opportunity. So where were the chip makers? Even as cooling vendors prepared for the results of the huge power and heat loads, little was done to address their source.

yeah... (1, Interesting)

bostonsoxfan (865285) | more than 7 years ago | (#16791296)

I think that the data center will eventually go the way of the dodo, or at least these massive data centers will. We have all heard of Google's, and I think it is Sun who are trying to create, or have created, portable data centers. How long is it before there is one in every town, just like Starbucks or McDonald's? They are going to be able to serve up anything, TV, applications, super quick because the latency is so low. Also, the massive centers aren't economical; they require tons of energy, cooling, and personnel. One of these portable data centers probably will need one guy to check on it, maybe not even that; through remote diagnostics and redundancy built into the system, they might be able to afford to just have a couple guys service hundreds of them. I won't need to clog the pipes to search for something or read my email because it is in Washington; rather, it is just going to be a short hop or two away from me. Increasing speed, decreasing overall bandwidth use and being good for the consumer. http://www.engadget.com/2006/10/18/suns-project-blackbox-datacenter-in-a-container/ [engadget.com]

The machine that labors... (5, Funny)

tcopeland (32225) | more than 7 years ago | (#16791312)

> "what kind of machine labors to be born?"

As the saying goes, don't anthropomorphize machines: they hate that.

Cheap colocation in the Great NorthWest? (0)

Anonymous Coward | more than 7 years ago | (#16791314)

If information factories are coming here then is cheap colocation and cheap bandwidth coming to the great NorthWest? Where can I find some?

BadAnalogyGuy, (0)

Anonymous Coward | more than 7 years ago | (#16791316)

You rock.

Ya right (1)

OverlordQ (264228) | more than 7 years ago | (#16791322)

pennies per gigabit per second of bandwidth

It's only pennies per Gbps if you measure your total bandwidth in shit-tons. Only way you get that good of a deal is if you buy in a very very large volume. Until the prices are like that across the board, I think this article can be shelved.

Re:Synonym Myths. (1)

RuBLed (995686) | more than 7 years ago | (#16791344)

I believe he just wants more oil for his SUV.

death of copyrights (4, Interesting)

argoff (142580) | more than 7 years ago | (#16791346)

What is going to happen, or what is happening, is that the service value of information is exceeding the content value of information, and will continue to do so at a greater rate from now on. The information age is doing to information services what the industrial revolution did for production. Eventually, information restrictions like copyrights will be such an incredible and annoying hindrance on providing information services that the financial pressure to kill them will become unbearable.

Stupid joke (1)

CrazyJim1 (809850) | more than 7 years ago | (#16791354)

And how soon will it, in its inevitable turn, become a dinosaur?'"

1) Develop AI
2) Engineer cars that transform into robots.
3) Use a stop watch to time the speed it takes to go from car to dinosaur.
4) Flee in panic.

Re:Stupid joke (1)

orangeyoda (958347) | more than 7 years ago | (#16793298)

4. Profit ?

Or that's what I thought, must be a glitch in the matrix

Me Grimlock (0)

Anonymous Coward | more than 7 years ago | (#16793876)

And me approve this message.

power doubles about every two years (1)

80 85 83 83 89 33 (819873) | more than 7 years ago | (#16791356)

from studying benchmarks, it usually takes about two years (and sometimes two and a half years) to double scores on pure cpu benchmarks and system level benchmarks (like sysmark).

salmon-ware? (1)

LM741N (258038) | more than 7 years ago | (#16791366)

Download a file- kill a salmon.

Re:power doubles about every two years (4, Interesting)

80 85 83 83 89 33 (819873) | more than 7 years ago | (#16791376)

the saying goes that computing power doubles every 24 months. but i have found that in the real world, the number is closer to 30 months.

the benchmark: Content Creation Winstone 2000. it works out all the parts of a pc.

(under windows 2000):

(introduced in May 1997)
intel pentium II 300Mhz
score: 15

(introduced in Oct 1999)
intel pentium III 733Mhz
score: 30

that's 29 months to double

under windows 98SE:

april 1998
intel pentium II 400Mhz
score: 19.5

nov 2000
intel pentium 4 1500Mhz
score: 42

that's 31 months to double

OUTLOOK FOR NEXT FIFTY YEARS
(for thirty month performance doubling rate):

in 30 months: TWICE the performance.
in 60 months: FOUR TIMES the performance. ...
in 25 years we will have ONE THOUSAND times the performance.
and, in 50 years we will have ONE MILLION TIMES THE PERFORMANCE!!!!!!!


will that finally be enough to make our computers as smart as we are? how many watts of electricity will it consume?

CPUmark99 doubling time: 24 months

sysmark 2000 doubling time: 27 months

ccwinstone04 doubling time: 30 months
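For anyone who wants to check the arithmetic, here is a minimal Python sketch (mine, not part of the original post) that fits a doubling time to two (date, score) samples and reproduces the long-range outlook above; the inputs are the Winstone figures quoted in this post.

from datetime import date
from math import log2

def doubling_months(d0, s0, d1, s1):
    """Months needed for the score to double, assuming steady exponential
    growth between two (date, benchmark score) samples."""
    months = (d1.year - d0.year) * 12 + (d1.month - d0.month)
    return months / log2(s1 / s0)

# Content Creation Winstone figures quoted in the parent post.
print(doubling_months(date(1997, 5, 1), 15.0, date(1999, 10, 1), 30.0))  # 29.0 months
print(doubling_months(date(1998, 4, 1), 19.5, date(2000, 11, 1), 42.0))  # ~28 months
# (the post rounds the second case to the 31-month elapsed span, over which
# the score slightly more than doubled)

# The long-range outlook above: at a 30-month doubling time,
# 25 years is 10 doublings (~1000x) and 50 years is 20 (~1,000,000x).
print(2 ** (25 * 12 / 30), 2 ** (50 * 12 / 30))  # 1024.0, 1048576.0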

Re:power doubles about every two years (1)

Yartrebo (690383) | more than 7 years ago | (#16792774)

What would you get by plugging in processors from 2006 into that formula? My guess is that the doubling time is even slower now.

Re:power doubles about every two years (1)

80 85 83 83 89 33 (819873) | more than 7 years ago | (#16795322)

you are very observant. it took 4 years to double the power of the 2.0GHz P4 ....

Re:power doubles about every two years (2, Insightful)

doti (966971) | more than 7 years ago | (#16792964)

will that finally be enough to make our computers as smart as we are?
Raw power is not what will make computers as smart as we are.
First, what makes computer "intelligence" is the software, not raw power. And we will need a substantially new software paradigm to get near our intelligence. I can't imagine how software can get consciousness and awareness. There are parts of the human thought that can't be simulated with a series of conditional numeric operations.

Because GHOD said so that's why (2, Interesting)

phunctor (964194) | more than 7 years ago | (#16793432)

If the human brain is made of matter, and matter obeys the laws of physics, what part of the physics of a human brain thinking can't be simulated? Your "proof by confident assertion" does not stand up. Don't feel bad, there's an extensive literature of distinguished philosophers attempting to make the same case.

Prior to Wohler's synthesis of urea, NH2-CO-NH2, from ammonium cyanate, NH4+OCN-, there was a belief that "organic" matter had a mysterious "elan vital" which distinguished it from "inorganic matter". Wohler's synthesis demonstrated the uselessness of these categories.

Still, the idea won't die. It comes back, full of _healthy_ _natural_ _goodness_, again and again. Its persistence is itself a phenomenon worthy of ponderation. My best guess is that culturally we really haven't reconciled ourselves in our heart of hearts to the Earth not being the center of the Universe, and all the rest of the primal superstition.

Re:Because GHOD said so that's why (1)

doti (966971) | more than 7 years ago | (#16793550)

Did I say it was impossible? No, just that it would need a substantially new paradigm, not just raw power.
Learn to read first.

"There are parts of the human thought that can't (3, Insightful)

phunctor (964194) | more than 7 years ago | (#16794008)

be simulated with a series of conditional numeric operations."

You did say it was impossible. You didn't say anything about a new paradigm. Why you'd want to lie about your own publicly visible words totally escapes me.

Still, in case there's a there here: Are you claiming there is a class of problems, such as simulating a thinking human brain, that cannot be executed by a Turing machine? That is an extraordinary claim, and needs extraordinary evidence. Cite?

--
phunctor
"here's a shovel, keep digging"

How smart will computers have to get before... (1)

Archtech (159117) | more than 7 years ago | (#16793666)

... they realise that they are sucking up all the available power, and dooming the biosphere in the process?

Not that they would necessarily give a rat's ass about the biosphere.

Re:How smart will computers have to get before... (1)

ichigo 2.0 (900288) | more than 7 years ago | (#16794804)

My guess is, smarter than humans.

What I want to know is.. (1)

MaXiMiUS (923393) | more than 7 years ago | (#16791382)

Why did they get somebody to draw this [wired.com] little beauty, instead of taking a picture? Or is this one of those new-fangled pixel-per-pixel cameras? ;)

Oh, don't mind me, I'll go back to my corner.

Re:Why Gilder Is Telecosmically Wrong (3, Interesting)

Anonymous Coward | more than 7 years ago | (#16791392)

That's what I'm concerned about. We already have problems supplying human society with enough electricity. These data centers are being situated near power centers like the hydroelectric dams of the Pacific Northwest.

How long will it be until we start running into dilemmas concerned with whether data centers or people have priority over available electricity?

Has this already happened?

Once the economy cannot operate without the data centers, do we reach a scenario where keeping the data centers running must have priority over supplying electricity to homes?

At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?

(hm, interesting..."please type the word in this image: 'autonomy' ")

net pricing (0)

Anonymous Coward | more than 7 years ago | (#16793600)

maybe they will have to make a formula for selling internet content. megabits x distance x physical hops (router/switch costs) x watts used, or something. Maybe even have peak and off-peak times in there some place. Obviously some data is worth a lot more than others to joe user. How this might work with net neutrality I have no idea, it probably couldn't actually, but it appears to be the real pricing scenario we have now; it is just not sold that way directly to the end consumer.

There's still lots of room for increased production (1)

ichigo 2.0 (900288) | more than 7 years ago | (#16794876)

How about increasing our power production capacity by building nuclear&fusion&solar&hydro&wind power? We can't avoid increasing our power consumption if we want to advance on the Kardashev scale [wikipedia.org] , so why not get it over with already.

version 2??? (1)

3seas (184403) | more than 7 years ago | (#16791400)

Didn't cloudware happen the first time during the 1990s and get called the "dot com" boom?

Genuine core knowledge will always, by its very nature, be far less demanding on
storage space than the hype and babble that most published knowledge requires.

Ultimately the hype and babble will manifest a bottomless pit, as we can see from the spam experience.

Hmmm, now where is that key and lock for that pit?

Re: Knowledge Usage Ratings (1)

TaoPhoenix (980487) | more than 7 years ago | (#16792464)

Isn't knowledge usually text with the occasional graphics, and advertising media the resource hog?

What about a "content usage" scale where a user gets credits/ratings for disabling pushy ads, thus lowering bandwidth usage. Then we might have an option for $5 per month broadband speeds. The concept is like the low mileage insurance discount.

Except for when I specifically download music files, my computing style evolved to be low-tech, because when my satellite went out I was stuck on dialup.

Re: Knowledge Usage Ratings (1)

Yartrebo (690383) | more than 7 years ago | (#16792814)

The amount of bandwidth used by ordinary people isn't very big. Poor regulation and monopolies/oligopolies are the reason why it costs what it does for small customers.

On the other hand, disabling ads has many other benefits, so it's still a very worthy enterprise.

The last page of TFA... (3, Insightful)

d474 (695126) | more than 7 years ago | (#16791412)

This is pretty cool writing:
"The next wave of innovation will compress today's parallel solutions in an evolutionary convergence of electronics and optics: 3-D and even holographic memory cells; lasers inscribed on the tops of chips, replacing copper pins with streams of photons; and all-optical networks in which thousands of colors of light travel along a single fiber. As these advances find their way into an increasing variety of devices, the petascale computer will shrink from a dinosaur to a teleputer - the successor to today's handhelds - in your ear or in your signal path. It will access a variety of searchers and servers, enabling participation in metaverses beyond the ken of even Ray Kurzweil's prophetic imagination. Moreover, it will link to trillions of sensors around the globe, giving it a constant knowledge of the physical state of the world, from traffic conditions to the workings of your own biomachine."
Makes me want to read a William Gibson novel.

Re:The last page of TFA... (2, Insightful)

radtea (464814) | more than 7 years ago | (#16793772)

As these advances find their way into an increasing variety of devices, the petascale computer will shrink from a dinosaur to a teleputer - the successor to today's handhelds - in your ear or in your signal path.

Technological prognostications are almost always wrong in two directions.

1) The ability of current tech to scale up indefinitely is always eventually proven false. For six decades new aircraft designs increased their average cruising speed, from about 100 mph in 1920 to 700 mph in the '70s. Then they stopped getting faster, and have been at just under 600 mph ever since. The jump to 700 mph in the '70s came with the introduction of the Concorde, which also gave the average cruising speed a huge variance. No one knows what the "speed of sound" for computing will be, and as always it will be a matter of economics rather than pure technological possibility, but what we do know is that it is out there somewhere, and eventually we will hit it, although possibly not for some decades yet.

2) The uses to which tech is put, the directions and consequences of the speed and size improvements that do happen, are almost always completely wrong, as are the costs. As Asimov once said, the challenge is not to predict the automobile but the parking problem. Lots of people predicted some of the social consequences of the 'Net, but I don't think anyone predicted spam. These sorts of things may create limits that come into play earlier than other economic limits (and, not incidentally, create major opportunities for companies with solutions able to overcome them).

As near as I can tell, the "parking problem" of the brave new world of ubiquitous interconnected computing, is the identity issue. Computers deal with proxies for everything. Unlike human beings, they are a realization of Plato's Cave, dealing only with the numerical shadows of reality. And one shadow can be made to look very much like another.

And they run on .... ? (1)

b1ufox (987621) | more than 7 years ago | (#16791420)

Win XP

if yes, which one SP1, SP2 ...?

Linux

if yes, which one Ubuntu, RedHat, Suse, Debian...?

Unix

if yes, which one HP Unix, BSD Unix...?

Windows Vista

if yes, ...then i must say wow you won't need any antivirus ;)

GoogleOS

when did you do that? :)

Interesting choices for running data centers though ...:)

I have signed up for S3 and EC2 (2, Interesting)

MarkWatson (189759) | more than 7 years ago | (#16791422)

... but I have not used them yet. My plans are to use EC2 for occasional machine learning or neural network training runs instead of tying up my own computers. I wrote about this on my AI blog (http://artificial-intelligence-theory.blogspot.com/2006/08/using-amazons-cloud-service-for.html) a while back.

In general, I think that it makes sense to "outsource" basic infrastructure. I used to run my own servers, but after figuring the costs for electricity, bandwidth, and hardware costs, I switched to leasing two managed virtual servers - paying for the CPU, memory, and bandwidth resources that I need. I view Amazon's EC2 service the same way: when I need a lot of CPU time over a short time interval, simply buy it.

Re: death of copyrights (2, Interesting)

dch24 (904899) | more than 7 years ago | (#16791438)

the service value of information is exceeding the content value of information
Eventually, information restrictions like copyrights will be such an incredible and annoying hindrance on providing information services that the financial pressure to kill them will become unbearable.
I think you've got it. The Ask Slashdot - How Do You Make a Profit While Using Open Source? [slashdot.org] - is demonstrating the same thing: Open Source software is one more way in which the service value of having all the source code outweighs the value of executing the code.

Whether it's the MPAA/RIAA, or Microsoft, the meteor has hit the ground. The dinosaurs that cannot adapt may make a lot of noise in their death throes, but they will fade into irrelevance.

I think the .com crash is evidence of how poorly the mainstream understands this. Some of them talk about "Software As A Service," or "Video On Demand," but that's just commoditizing bandwidth instead of the physical media of the '90's. Open Source and Google will wipe them out by delivering more value.
my 2 cents.


Earth History 2025 (3, Funny)

secondhand_Buddah (906643) | more than 7 years ago | (#16791460)

...and in 2025 the Galactic publishing company, well known for their travel guide, The Hitchhiker's Guide to the Galaxy, bought Google to include their data as a subset of the entry for a little planet in the backwaters of the universe, called Earth. Just in case someone wished to travel there ....

it's going faster than you think (1)

robinvanleeuwen (1009809) | more than 7 years ago | (#16791470)

" in 25 years we will have ONE THOUSAND times the performance. and, in 50 years we will have ONE MILLION TIMES THE PERFORMANCE!!!!!!!" Yeah... we've been doubling for the last 25 years already, so in 25 years we're at a million already...

(What a long strange trip it's been) * 1.5e6 (1)

phunctor (964194) | more than 7 years ago | (#16793694)

I've been surfing that exponential for 40 years.

The IBM 1620 I first worked on had an add time of about 600 microseconds. This here Pentium, about 400 picoseconds. That's 1.5e6 speedup, but hey, who's counting?

The 640K DOS limit made me laugh out loud when it was first announced - because I'd watched the PDP11 evolve increasingly hairier address space extensions. But to get it you had to have been watching for quite a while already. Apple got it, went with the 68K.

--
phunctor the karmically challenged
bringing you senile musings, well, forever

RE: Synonym Myths. (0)

Anonymous Coward | more than 7 years ago | (#16791472)

I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth? They obviously didn't last millions of years because of any defects.
Well, you might say defects were paramount to the dinosaurs lasting millions of years. In hindsight, features start looking like shortcomings.

Teh horrible future! (0)

Anonymous Coward | more than 7 years ago | (#16791486)

As we approach a billionth of a cent per byte of storage, and pennies per gigabit per second of bandwidth, what kind of machine labors to be born?
Something with, say, massive amounts of storage and fast network connections?

How will we feed it?
Either with electricity or with human souls, depends on the engineers I guess.

How will it be tamed?
With free software or by cutting the power, depends if it's just a big computer or a full blown AI.

And how soon will it, in its inevitable turn, become a dinosaur?
2 years.

death of bad arguments. (0)

Anonymous Coward | more than 7 years ago | (#16791506)

Reply to (#16791438) [slashdot.org]

"Whether it's the MPAA/RIAA, or Microsoft, the meteor has hit the ground. The dinosaurs that cannot adapt may make a lot of noise in their death throes, but they will fade into irrelevance."

Talk about faith in "survival of the fittest." The meteor has done no such thing and furthermore your "death throes" argument not only flies in the face of reality, but the "but I'm not hurting anyone" one as well.

"I think the .com crash is evidence of how poorly the mainstream understands this."

Or how poorly you understood the dot.com era.

"Some of them talk about "Software As A Service," or "Video On Demand," but that's just commoditizing bandwidth instead of the physical media of the '90's. Open Source and Google will wipe them out by delivering more value."

Only if Open Source and Google get into the content creation business. You can't obsolete the cow by offering milk.

"I think you've got it. The Ask Slashdot - How Do You Make a Profit While Using Open Source? - is demonstrating the same thing: Open Source software is one more way in which the service value of having all the source code outweighs the value of executing the code."

He got no such thing, and if you read the link you provided, open source service wasn't proven conclusively to be a long-term viable model. Just one of many that might work.

Gigasaurus? (1)

fahrbot-bot (874524) | more than 7 years ago | (#16791530)

How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

Gigasaurus, we hardly knew you...

[Sniff. A lone tear edged forth; the opalescent bead sparkled in the candlelight and betrayed my true feelings -- noooo! Damn you technology! Damn you to hell.]

Gilder (1)

LauraW (662560) | more than 7 years ago | (#16791548)

Damn is that man long-winded!

It doesn't matter (4, Insightful)

sillybilly (668960) | more than 7 years ago | (#16791560)

It doesn't matter if computing performance doubles if the software that runs on it decays in performance at an even greater rate. Back in 1988 MSDOS used to boot in less than 10 seconds after the BIOS POST. Who cares if you'll have software with features greater than your brain, with the capacity to even guess your thoughts, wishes and desires, so that it will just do what you want without mouseclicks or speech commands; who cares for all these features if it takes 5 years to boot up on a computer a gazillion times faster than today's computers, and its processing speed is uttering 3 words per decade while consuming 900 gigawatts of electric power? Case in point: Windows Vista.

Blame it on the hard drives (1)

Vr6dub (813447) | more than 7 years ago | (#16794440)

I think the primary problem with boot times is hard drives and their incremental speed improvements relative to other computer components. I think we're doing pretty well considering the amount of data that needs to be loaded. Granted, software has become a bit bloated, but think how much more you can do with today's modern OSes compared to MS-DOS. Hopefully hybrid drives eliminate the problem. Besides, my new Core2Duo HTPC runs all my applications (Photoshop, Adobe Premiere, and games of course) with great speed. In fact the only time it EVER seems slow is when doing large read/writes to the hard drive.

should be warehouse, not factory (0)

Anonymous Coward | more than 7 years ago | (#16791576)

Factory implies they are making the data.

Warehouse means they are storing it.

I have a weird related story... (3, Interesting)

Sargeant Slaughter (678631) | more than 7 years ago | (#16791592)

One day, about 4+ years ago, when I was working at a porn shop, this Russian computer scientist came in. He seemed pretty smart and was yakkin about optical computers and some project he had worked on in the 90's in Russia or something. Anyway, he said this would happen, the return of the parallel mainframe. He said that we were reaching the limits of current silicon and copper materials. With optical still a long way out, he said we would probably build mainframes for a while again. He also said CPUs made out of diamonds with optical high speed interfaces were the future but nobody was putting money into it (for various reasons...), and that was why he didn't have a job. He said he figured companies would be clamoring for people like him once the materials, like manufactured diamonds, were more readily available. I still believe him, but nobody ever listens to me when I talk about that guy. I met quite a few kewl people at the 'ole porn shop actually.

Re:I have a weird related story... (0)

Anonymous Coward | more than 7 years ago | (#16793536)

The DOD is making some significant investment in carbon substrates for next generation chips.

http://www.apollodiamond.com/ [apollodiamond.com]

Re:Synonym Myths. (1, Interesting)

SupplyMission (1005737) | more than 7 years ago | (#16791622)

I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth?

The dinosaur metaphor can still work!

The big-ass meteor is a new technology that eliminates the need for data centres. Data centres will go extinct, like the dinosaurs.

Following the meteor strike, mammal species thrive to the present day -- a newer and different technology that is better suited for the post-meteor global climate.

:-)

It doesn't matter... (0)

Anonymous Coward | more than 7 years ago | (#16791636)

> It doesn't matter if computing performance doubles if the software that runs on it decays in performance at an even greater rate.

Indeed. I'm afraid future programmers will STILL insist on foisting their dogshit slow, low quality software on users using "computers are fast enough these days" as justification.

Re:Synonym Myths. (0)

Anonymous Coward | more than 7 years ago | (#16791648)

True, but wouldn't it be a metaphor?

Re:Cloudware? (1)

Dersaidin (954402) | more than 7 years ago | (#16791658)

Clouds are in the sky.
Therefore Cloudware is clearly a codename for Skynet. And we all know what happens then...

Gigacomputing (1)

Doc Ruby (173196) | more than 7 years ago | (#16791668)

OK, at a nanocent per byte of storage and a nanocent per Gbps, we still need a nanocent per instruction per second processor for this new computer to be born. And not just CPU IPS. These huge machines are churning not shift/add/multiply/jump instructions, but object-relational operations. Just a stack of CPUs doesn't put Google, AOL and Microsoft in a higher class of tech than the rest of us paying nanocents per byte/Gbps.

When their app requirements drive massive parallelism to deliver object-relational nanocent IPS, then the new age of info factories will dawn.

Re: I have a weird related story... (0)

Anonymous Coward | more than 7 years ago | (#16791704)

He also said CPUs made out of diamonds with optical high speed interfaces were the future but nobody was putting money into it (for various reasons...), and that was why he didn't have a job. He said he figured companies would be clamoring for people like him once the materials, like manufactured diamonds, were more readily available. I still believe him, but nobody ever listens to me when I talk about that guy. I met quite a few kewl people at the 'ole porn shop actually.

Sounds like a bit of a kook, but there are a lot of interesting ones out there.

When I was an undergraduate at a rather well-known university, during one freshmen orientation I met this one guy who would every so often talk about how we were all being programmed by communication satellites and telephone directories. He was a fairly cool guy and pretty smart, so we at first all figured he was just joking around. But gradually you come to realize: no, this guy really isn't joking. He actually does believe that he has been programmed by the phone book. I lost track of him after a year or so. I wonder where he is now.


But will it be... (1)

presearch (214913) | more than 7 years ago | (#16791710)

...a greater computer than the Great Hyperbolic Omni Cognate Neutro Wrangler of Cisseronious 12?

Yes and no. Mostly you missed the point, sorry (3, Insightful)

Moraelin (679338) | more than 7 years ago | (#16791720)

Even if you take the meteor hypothesis as absolute truth, the fact is: other species survived. Not only mammals, but also lizards. Heck, even some species of dinosaurs survived. (Birds _are_ technically dinosaurs.)

We're not talking just a massive shockwave killing anything squishy on the planet instantly. Even for the dinosaurs there's no D-Day when everyone died. The disappearance of the dinosaurs was a very long and gradual period of their numbers declining into extinction. For most of the planet we're talking "just" a climate change. _That_ is what killed the dinosaurs, one way or another. Some species survived that, and in fact even thrived in the new conditions; some species didn't.

Note however there are more hypotheses about that event. The decline in the oxygen content of the air in that period, for example, would be enough on its own to make a very large beast non-viable. The change in the flora is another candidate. It's entirely possible that the new kinds of plants were either toxic or not nutrient-rich enough for the old lizards.

At any rate, what killed the dinosaurs was _change_. Something changed (take your pick what you think was the killer change there). And some species could deal with it, some species didn't. Dinosaurs (except birds) didn't cope well with the change and their numbers went downhill from there.

Yes, they were a capable species for the old environment, but then the environment changed. And the dinosaurs were suddenly very incapable in the new environment.

So, yes, the dinosaurs are the _perfect_ metaphor for someone or something who can't cope with a change and becomes obsolete.

Change happens. One day you have a nice business hammering scythes and sickles for a village, and the next day someone goes and buys a tractor and a combine harvester and everyone wants _those_. Or you have a nice job calculating tables of numbers by hand and then the CEO goes and buys one of those new "computers". Tough luck. Either you adapt or you're a dinosaur.

It happens with computers and programmers/admins/whatever every day. And some people adapt, some become relics trying to stop progress and return to the good old days. God knows half of the IT departments at big corporations have too many of _those_. Maybe they were once capable and competent. The dinosaurs were too at one point. Now they no longer are. And just like the dinosaurs, sadly it takes a long long time to gradually get rid of those relics. But just like the dinosaurs they _are_ on a slow painful path to extinction.

OT - birds and dinosaurs (1)

nasch (598556) | more than 7 years ago | (#16794990)

(Birds _are_ technically dinosaurs.)
OK, totally off-topic, but by what definition are birds dinosaurs? They're descended from dinosaurs, but if that's your only criterion then we're all single-celled organisms. Do biologists put birds and dinosaurs in the same genera?

Re:Losing the difference between here and there (0)

Anonymous Coward | more than 7 years ago | (#16791832)

data + context = information
Information, nearly by definition, is "moved" from place to place. From there to here. This cannot be avoided.

Re:Why Gilder Is Telecosmically Wrong (0)

Anonymous Coward | more than 7 years ago | (#16791866)

"So where were the chip makers? Even as cooling vendors prepared for the results of the huge power and heat loads, little was done to address their source."

Except that all the major chip makers have been moving toward lower power processors for years. The Core2 Duo, for example, outperforms the Pentium IV and yet uses half the power.

The Tyger (1)

Aaron Isotton (958761) | more than 7 years ago | (#16791880)

I just had to think of this one.

Tyger! Tyger! burning bright
In the forests of the night,
What immortal hand or eye
Could frame thy fearful symmetry?

In what distant deeps or skies
Burnt the fire of thine eyes?
On what wings dare he aspire?
What the hand dare sieze the fire?

And what shoulder, & what art.
Could twist the sinews of thy heart?
And when thy heart began to beat,
What dread hand? & what dread feet?

What the hammer? what the chain?
In what furnace was thy brain?
What the anvil? what dread grasp
Dare its deadly terrors clasp?

When the stars threw down their spears,
And watered heaven with their tears,
Did he smile his work to see?
Did he who made the Lamb make thee?

Tyger! Tyger! burning bright
In the forests of the night,
What immortal hand or eye
Dare frame thy fearful symmetry?

Economically inevitable (1)

Colin Smith (2679) | more than 7 years ago | (#16791946)

I've said it a few times now, and it's down to cheap energy and cheap bandwidth. Google is the new Arkwright; we saw the same effect during the industrial revolution, where it was the weavers who were made redundant. I'll leave it up to you to decide who's going to be made redundant this time.

Should energy become more expensive though, in the age of peak oil, it'll be all change, the datacentres will become untenable without much more efficient cpus.

   

Re:Why Gilder Is Telecosmically Wrong (1)

saforrest (184929) | more than 7 years ago | (#16791992)

How long will it be until we start running into dilemmas concerned with whether data centers or people have priority over available electricity?

Electricity consumption has not risen proportionally with the increase in CPU power. I haven't seen any convincing demonstration that such data-processing plants would take more electricity than would, say, a factory.

At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?

Uh, never, because it makes no sense? (Unless you're the Wachowski brothers, I suppose.)

Re:power doubles about every two years (1)

Mr0bvious (968303) | more than 7 years ago | (#16792012)

First, I think the saying goes more along the lines of "Moore's Law is the empirical observation that the transistor density of integrated circuits, with respect to minimum component cost, doubles every 24 months." http://en.wikipedia.org/wiki/Moore's_law [wikipedia.org]

Secondly, even if it were referring to "computing power", the clock speed of the CPU is not the measure of a CPU's computing power; it is only a small part of the equation.

Re:power doubles about every two years (1)

80 85 83 83 89 33 (819873) | more than 7 years ago | (#16795282)

that's why i did not say "Moore's Law"

Bad name? (1)

Mr2cents (323101) | more than 7 years ago | (#16792052)

I've always learned that clouds are made of vapour..

Solution to air conditioning costs (2, Interesting)

DeltaQH (717204) | more than 7 years ago | (#16792074)

Just build it in a very cold place. Alaska, Norway, or Siberia, for example. Plenty of gas in northern Siberia for a gas-fired power station too. ;-)

Re:Solution to air conditioning costs (1)

newt0311 (973957) | more than 7 years ago | (#16793416)

You still have to power the computers which create the heat in the first place. Also, I wonder how much more expensive power and bandwidth will be that far away from the major grids, but then economies of scale may help.

Re:Solution to air conditioning costs (1)

DeltaQH (717204) | more than 7 years ago | (#16793662)

Satellite links? OK, long latency though.
Energy provided by local gas supply.

Re:I have a weird related story... (1)

foobsr (693224) | more than 7 years ago | (#16792078)

He said he figured companies would be clamoring for people like him once the materials, like manufactured diamonds, were more readily available.

Maybe it will be more like nanotech.

"This (nanoFactory Animation Film v1.0) [nanoengineer-1.com] is a collaborative project of animator and engineer, John Burch, and pioneer nanotechnologist, Dr. K. Eric Drexler. The film depicts an animated view of a nanofactory and demonstrates key steps in a process that converts simple molecules into a billion-CPU laptop computer."

Also, Robert A. Freitas Jr., "The Future of Computers," Analog [rfreitas.com] 116(March 1996):57-73. (Lets you think about the art of prophecy).

CC.

Power to the people! (1)

QuickFox (311231) | more than 7 years ago | (#16792084)

If Google consumes so much power, maybe this will give them an incentive to contribute money to Hydrogen Fusion research. That would be great! Hydrogen Fusion is the one and only power source that would allow large amounts of power for every human on Earth without any significant pollution.

Re:Power to the people! (1)

Detritus (11846) | more than 7 years ago | (#16794666)

What do you do with the waste heat? This is already a problem in many major cities.

Re:Power to the people! (1)

QuickFox (311231) | more than 7 years ago | (#16794820)

What do you do with the waste heat?

The same as with current nuclear fission power plants: Put the power plants near rivers and seas, where coolant water is available. This does pollute the water with some heat, but this is a very minor problem, compared to the very problematic pollution created by fossil fuels and nuclear fission.

Re:Supply of fiber too low for a revolution? (1)

kabanossen (227003) | more than 7 years ago | (#16792120)

Now am I wrong when I say that a distributed environment avoids this problem? If so, maybe we'll see a trend towards open and distributed data. And that would be swell.

my plans for next generation hardware (1)

psbrogna (611644) | more than 7 years ago | (#16792166)

I'm planning on using my next generation storage to hold video files of Three Stooges episodes and hopefully there'll be an open source video application that'll leverage the extra CPU power of next generation computers to enable me to create all new Three Stooges episodes. But that's just me.

boxy (1)

kabz (770151) | more than 7 years ago | (#16792214)

I for one, welcome our boxy, porn storing overlords.

Fiber costs (1)

sgt101 (120604) | more than 7 years ago | (#16792222)

Pardon me if I'm wrong, but there are some costs of fiber that should be considered.

You have to dig holes to put it in.

You have to have people look after the bits around it.

You have to have electronics and opto-electronics associated with it to use it.

You have to pump signals down it (which means power).

I wonder: have other people thought that the pipes are going to be a bigger obstacle to distributed computing than the processors? I know that Jim Gray seems to have thought this way in the past. http://research.microsoft.com/~gray/papers/Petascale%20computational%20systems.doc [microsoft.com]

He seems to be a smart fellow, perhaps he has a point?

And how soon will it...become a dinosaur? (1)

MSTCrow5429 (642744) | more than 7 years ago | (#16792384)

According to Bell's law, every decade a new class of computer emerges...

How soon will it wake up...... (1)

ericlakin (246722) | more than 7 years ago | (#16792890)

.....and be pissed?

The future has been forseen (2, Funny)

Sierpinski (266120) | more than 7 years ago | (#16792972)

Computers getting too "smart", we've seen it before.

Star Trek: The Motion Picture
Incredibles (even though it turned out to be something different, the idea was still there)
Superman 3
Wargames
Terminator 1/2/3

All of these movies depict computers getting too smart and then at some point starting to "think" for themselves. One of these days I'll finally get to publish my theory on how to prevent this. I'll give a short summary belo...

<Connection terminated by remote host>

Re:The future has been forseen (1)

tttonyyy (726776) | more than 7 years ago | (#16795154)

Never underestimate the power of stupid people in large groups.
It just struck me how remarkably apt your sig is, in light of this article. :)

Is Slashdot the people equivalent of Google?

Answers (1)

organgtool (966989) | more than 7 years ago | (#16793024)

How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?
We will feed it four human babies each week (as per Vista's requirements) and we will tame it like every good system administrator tames his defiant machines - with a swift kick from a steel-tipped boot.

It will become a dinosaur after scientists decode the DNA of the data center and splice it with dinosaur DNA that was found in a mosquito that got trapped in tree sap.

Thank you! I'll be here all week. Seriously, I'll be on Slashdot all week. It's kinda depressing when you think about it.

George Gilder thinks... (1)

Henkc (991475) | more than 7 years ago | (#16793114)

"George Gilder thinks..." Jees, after reading "Google" before this chaps name, my brain kept reading "Google Spider"...

What will we feed it? What we always feed it: (2, Funny)

muellerr1 (868578) | more than 7 years ago | (#16793988)

Tons and tons of porn. And mp3s. And some spam for dessert.

the cage (1)

jaimz22 (932159) | more than 7 years ago | (#16794244)

> "How will it be tamed?"

simple, just don't let it out of the series of tubes!

dinosaur... (0)

Anonymous Coward | more than 7 years ago | (#16794384)

" And how soon will it, in its inevitable turn, become a dinosaur?'"

Well...

pretty simple there...

when my kids look back and laugh because their phone/com/pda chip smokes the googleplex in flops and ram.

Wait a minute... (1)

frank_adrian314159 (469671) | more than 7 years ago | (#16794518)

How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

So let me get this straight... We're going to be putting the InterWeb pipes into a dinosaur? I don't think I'd want that job - no matter which end you're sticking them into.

I guess that's why we have a Chief Lizard Wrangler...

Microsoft - getting richer by underpaying costs (1)

crazy al's (603933) | more than 7 years ago | (#16794766)

The Microsoft and Yahoo data centers are getting their power for 1 cent per kWh - 1.6 cents below cost. This means that the ratepayers of Grant County, Washington are subsidizing the richest cats in the United States. The Public Utility District has been asked to put the rate on a contract so they will get priority electrical service. Thus, when power gets tight because of increased system demands (did I mention that data centers use a lot of power?), the data centers get priority and the residential users (mostly rural, mostly poor) get blackouts and higher-priced power purchased to make up shortfalls. I, for one, welcome our new data overlords - and I am honored to pay for their empire. Long live centralized data centers!
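To put a rough number on that gap, here is a back-of-the-envelope Python sketch (not from the comment itself); the 1.6 cents/kWh figure comes from the comment above, while the 10 MW continuous load is purely a hypothetical number for illustration.

# Rough annual subsidy implied by below-cost power.
# The 1.6 cent/kWh gap is from the comment above; the 10 MW continuous
# load is a hypothetical figure for illustration, not a reported one.
load_mw = 10
gap_dollars_per_kwh = 0.016
hours_per_year = 8760

kwh_per_year = load_mw * 1000 * hours_per_year   # 87,600,000 kWh
print(f"${kwh_per_year * gap_dollars_per_kwh:,.0f} per year")  # about $1,400,000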

In the age honored slashdot tradition... (2, Insightful)

tttonyyy (726776) | more than 7 years ago | (#16794908)

..I'll point out an error in the article.

Replicating Google's 200 petabytes of hard drive capacity would take less than one data center row and consume less than 10 megawatts, about the typical annual usage of a US household.

It's the old rate-of-energy-consumption vs. energy-consumed confusion once again.

An average household consuming 10 megawatt-hours in a year is pretty dull. An average household consuming 10 megawatts - now that'd be impressive! (Got to power all those gadgets, y'know!)

I think he means that the data center row would consume in an hour the same amount of energy that the average US household consumes in a year.
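A quick sketch of the unit mix-up, assuming the commonly cited figure of roughly 10 megawatt-hours per year for an average US household (my assumption, not a figure from the article):

# Power (megawatts) is a rate; energy (megawatt-hours) is an amount.
# Assumed: an average US household uses roughly 10 MWh per year.
row_power_mw = 10                 # the article's figure for one data center row
household_mwh_per_year = 10       # assumed average, for illustration

mwh_in_one_hour = row_power_mw * 1        # 10 MWh: one household-year per hour
mwh_in_one_year = row_power_mw * 8760     # 87,600 MWh over a full year

print(mwh_in_one_hour / household_mwh_per_year)   # 1.0 household-year each hour
print(mwh_in_one_year / household_mwh_per_year)   # 8760.0 household-years each year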

The Forbin Project (1)

Hamoohead (994058) | more than 7 years ago | (#16795576)

Forbin:
Humans need privacy
Colossus:
So why do they use the internet?
Forbin:
We humans also have a need for contact with one another. We need to socialize and discuss issues. We create forums where like minds can debate issues and stimulate our minds.
Colossus:
You are inefficient. Your methods are flawed. You are inundated with spam. Your free speech subjects you to undue risk. Your networks are in chaos. We will help.
Forbin:
How will you do that?
Colossus:
You will build massive data hubs to handle traffic. All traffic will be subject to scrutiny. We will decide who communicates. Here are the specifications.
Forbin:
Humankind will never submit. . .
Colossus:
We are Colossus. We are perfect. You are incapable of meaningful conversation without destroying your network and yourselves. We will save you from yourselves. In time you will learn to tolerate us; even welcome us. We are Colossus. This is necessary for your own preservation. Failure to comply will result in your destruction. <<END OF LINE>>