
DARPA Looks Beyond Moore's Law

timothy posted more than 10 years ago | from the and-around-and-through dept.

Technology

ddtstudio writes "DARPA (the folks who brought you the Internet) is, according to eWeek, looking more than ten years down the road to when, they say, chip makers will have to adopt totally new chip fabrication technologies. Quantum gates? Indium Phosphide? Let's keep in mind that Moore's Law was more an observation than a predictive law of nature, despite how often people treat it as one."

217 comments

Dying wish (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6748853)

Aha! I have achieved FP! Now, a dying man can rest in peace!

WHY DUBYA WILL PREVAIL (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6748856)

Dubya will never be held accountable because he makes his primitive followers feel proud. Face it:

  • Liberating poor countries from their oil is cool. It makes citizens with a low self-esteem feel like THEY PERSONALLY rule the world.
  • Firing a few hundred missiles from a safe distance is very heroic.
  • All of the soldiers who killed their own comrades and allies were heroes.
  • That blonde chick who failed her mission because she was too dumb to find her way is definitely a hero.
  • You don't need to be worth something to be accepted. You just need to wave a flag and shout "God bless America!"
  • Every failure can be a hero in Bush's America!
  • Seeing Dubya in a flight suit on board a carrier makes republicans shoot their load in seconds!
And as long as all of the above is true, the lies will go on.

what next... (5, Funny)

Keebler71 (520908) | more than 10 years ago | (#6748860)

First they want to get around privacy laws, now they want to break Moore's law... these guys know no bounds!

Re:what next... (3, Insightful)

The No Vlad Zone (695399) | more than 10 years ago | (#6748886)

Exactly. Any new technology put out by these guys is quite likely to contain anti-privacy technology secretly embedded. My 486 running FreeBSD and lynx is still good enough for me.

Re:what next... (1)

aled (228417) | more than 10 years ago | (#6748930)

I was waiting for the "breaking Moore's law" jokes; I just didn't expect one to be the first post.

Wooh there cowboy. (0)

airrage (514164) | more than 10 years ago | (#6748864)

Why am I always being forced to upgrade, darnit?

Re:Wooh there cowboy. (3, Insightful)

Gherald (682277) | more than 10 years ago | (#6748969)

You aren't being forced to do anything... you simply choose to do it to keep up with the times. Many consider this "progress".

First! (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6748865)

Sort of...losers

timothy is gay (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#6748867)

Timothy = Corky

Stacked chips (3, Interesting)

temojen (678985) | more than 10 years ago | (#6748869)

Perhaps stacked wafers with vertical interconnects might help... I'm not sure how you'd dissipate the heat, though.

they already are (1)

Thinkit3 (671998) | more than 10 years ago | (#6748918)

Multilayer chips have been around a long time. I think it's up to 7 or 8 layers by now. This idea, which exists outside of time, has been discovered on earth independently of you. Neither you nor the earlier discoverer created it.

Re:they already are (1)

temojen (678985) | more than 10 years ago | (#6748937)

Chips are inherently multilayer. I'm talking about stacking the wafers, not just the doping.

uh I mean the interconnects too (1)

Thinkit3 (671998) | more than 10 years ago | (#6749006)

The physical layout is actually multi-layer already. It's on a single wafer though. Dope.

Re:uh I mean the interconnects too (0)

Anonymous Coward | more than 10 years ago | (#6749030)

How about instead of making the chips taller, make them wider?

Re:uh I mean the interconnects too (0)

Anonymous Coward | more than 10 years ago | (#6749110)

Because you then induce further gate delays. You forget that capacitors have charge/discharge times and the further the signal has to travel the more you have to account for this. It becomes increasingly difficult to deal with and ultimately requires you to have slower clock speeds overall.
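
As a rough sketch of that gate-delay argument, here is some back-of-the-envelope Python; the RC constants are illustrative assumptions, not real process data:

# RC wire delay grows with the square of wire length, since both
# resistance and capacitance scale linearly with length.
R_PER_MM = 1000.0     # wire resistance in ohms per mm (assumed value)
C_PER_MM = 0.2e-12    # wire capacitance in farads per mm (assumed value)

def wire_delay_ns(length_mm):
    """Elmore-style 0.69*RC delay estimate, in nanoseconds."""
    return 0.69 * (R_PER_MM * length_mm) * (C_PER_MM * length_mm) * 1e9

for mm in (1, 5, 10, 20):
    print("%2d mm wire: ~%.3f ns" % (mm, wire_delay_ns(mm)))
# Doubling the die width quadruples the worst-case wire delay, which is
# why a wider chip forces a slower clock.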

Re:Stacked chips (0)

Anonymous Coward | more than 10 years ago | (#6748928)

Make the chips out of diamonds, then use liquid oxygen or something as a cooling system.

Re:Stacked chips (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6748948)

Dude. That was funny. :) I never seem to have mod points when I need them.

Re:Stacked chips (0)

Anonymous Coward | more than 10 years ago | (#6749182)

Wow, you're smart! You should use your real name when you post stuff like this, so we all know who is so smart.

Re:Stacked chips (2, Funny)

GuyMannDude (574364) | more than 10 years ago | (#6748947)

For a minute there I misread and thought your subject line was "Stacked chicks"! Then I realized you were just talking about some computer stuff. Dang!

GMD

Re:Stacked chips (0)

Anonymous Coward | more than 10 years ago | (#6749048)

Odd sort of post for a person named "GayMannDude."

Re:Stacked chips (0)

Anonymous Coward | more than 10 years ago | (#6749129)

Too bad his handle is GuyMannDude you fucking illiterate.

Diamonds are no longer a girl's best friend (2, Interesting)

beacher (82033) | more than 10 years ago | (#6748956)

I saw this article [wired.com] about new diamond manufacturing techniques and it's an interesting read. Having diamond based processors looks like a viable technology in the near future and heat dissipation is one of the major reasons that they're considering diamond.

I'm just worried about what my wife will say when the diamond in my machine is bigger than the one on her finger...
-B

Re:Stacked chips (Sloooowwww) (2, Informative)

izto (56957) | more than 10 years ago | (#6748962)

a) Chips are already "stacked": layer over layer of silicon.

b) If you are talking about stacking dice (that is, literally stacking chips inside the package), then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times bigger than the distance of any on-chip interconnection. Which means the communication between layers of stacked chips would be thousands of times slower. Not very good for microprocessors...

Re:Stacked chips (Sloooowwww) (1, Funny)

Anonymous Coward | more than 10 years ago | (#6748986)

B-b-b-ut.. Hypertransport!

(sorry, I just like the name)

Re:Stacked chips (Sloooowwww) (4, Insightful)

temojen (678985) | more than 10 years ago | (#6749093)

the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times bigger than the distance of any on-chip interconnection.

But also thousands or hundreds of thousands of times smaller than going outside the package, which would make it ideal for multi-processors, array processors, or large local caches.

Re:Stacked chips (Sloooowwww) (2, Informative)

Bender_ (179208) | more than 10 years ago | (#6749107)

a) Chips are already "stacked". Layer over layer of silicon

False: there is just one active layer of single-crystalline silicon that contains the devices. The remaining layers are interconnects.

b) If you are talking about stacking dice (that is, literally stacking chips inside the package) then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times bigger than the distance of any on-chip interconnection.

How? Why? The lateral extent of any die is usually bigger than its height. In fact, the distance would be much shorter: active layers would be separated by less than 100 micrometers.
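
To put rough numbers on the disagreement (order-of-magnitude assumptions, not measurements), a quick Python comparison:

# Order-of-magnitude signal path lengths for the three cases being argued.
paths_um = [
    ("across a ~15 mm die laterally",              15000),
    ("die-to-die vertical hop (~100 um spacing)",    100),
    ("chip-to-chip over a ~10 cm board trace",    100000),
]
for name, um in paths_um:
    print("%-45s %8d um" % (name, um))
# If stacked layers really sit ~100 um apart, the vertical hop is shorter
# than crossing the die laterally (Bender_'s point), and both are far
# shorter than leaving the package (temojen's point).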

Re:Stacked chips (2, Funny)

Anonymous Coward | more than 10 years ago | (#6749092)

perhaps stacked wafers with vertical interconnects might help... I'm not sure how you'd dissipate the heat, though.

That's an easy one! Between each wafer, you place a delightful creme filling. The filling would be of such consistency that no matter how hot the wafers are allowed to get, the creme does not melt.

I propose we call this new technology Overall Reduction of Exothermic Output, or OREO for short.


Re:Stacked chips (0)

Anonymous Coward | more than 10 years ago | (#6749161)

Cray Computer Corp (CCC, not Cray Research Inc CRI) did that with the Cray-3. They immersed the exposed wafers in the coolant for cooling.

Reality distortion field (3, Funny)

BWJones (18351) | more than 10 years ago | (#6748872)

Moore's law, bah! Thinking about it, DARPA should get Steve Jobs on board to study his Reality Distortion Field. Think of the military aspects of.......oh, wait. We already have that.

Re:Reality distortion field (0)

Anonymous Coward | more than 10 years ago | (#6748984)

Troll??? No dude, this is funny. Check out this guy's info and his website. Mos def a Mac guy. Whoever mods this as troll is just a Mac lover without a sense of humor.

It's called (1, Funny)

Anonymous Coward | more than 10 years ago | (#6748987)

It's called the Bush Method. It isn't as fast or elegant as a genuine Reality Distortion Field, but it gets the job done about as well most of the time and the great thing is, it's cheaper and anyone can do it.

The Bush Method is so simple, it's amazing no one thought of it before 2000. All you have to do is take the thing about reality you want to distort, and state that it has changed, whether or not it actually has. The amazing thing is, if you say it enough times publicly, it actually becomes true.

The Bush Method has already revolutionized both politics and business (See: Darl McBride, career of) and i'm sure DARPA will pick up on the military applications any day now. Expect, come 2005, to see President Bush repeatedly stating on national television that microchips are a hundred times faster than they were six months ago. Once that begins, Moore's Law is toast!

Re:It's called (3, Insightful)

Gherald (682277) | more than 10 years ago | (#6749040)

> The Bush Method: all you have to do is take the thing about reality you want to distort, and state that it has changed, whether or not it actually has

Why do you give Bush the credit? This shit is Marketing 101 and Politics 102.

Re:It's called (0)

Anonymous Coward | more than 10 years ago | (#6749078)

Because he's a green? They hate successful business types, and people that eat meat, or have a religion that doesn't centre on hugging trees and doing the bidding of bunnies and deer, or breathe, or cut their grass.

Enough with "moore's law" (5, Insightful)

Thinkit3 (671998) | more than 10 years ago | (#6748889)

It's just a wild guess. It has absolutely nothing to do with physics, which gives us the real laws we all live by. It has much more to do with human laws, such as patents and copyrights, that limit progress.

No, not just a wild guess (4, Informative)

kfg (145172) | more than 10 years ago | (#6748993)

An educated observation, which is why it basically works.

Please note that the observation was well enough educated that it includes the fact that its validity will be limited in time frame and that before it becomes completely obsolete the multiplying factor will change, as it already has a couple of times.

In order to understand Moore's Law one must read his entire essay, not just have some vague idea of one portion of it.

Just as being able to quote "E=mc^2" in no way implies you have the slightest understanding of the Special Theory of Relativity.

KFG

Re:No, not just a wild guess (1)

Gherald (682277) | more than 10 years ago | (#6749055)

Yeah, and it's funny because first Moore came up with doubling every year, then later he said two years, but he never said anything about a year and a half.
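
For what it's worth, the difference between those doubling periods compounds fast; a minimal sketch of the arithmetic:

# Compound growth under different assumed doubling periods.
def growth_factor(years, doubling_period):
    return 2 ** (float(years) / doubling_period)

for period in (1.0, 1.5, 2.0):
    print("doubling every %.1f yr -> %6.0fx in 10 years"
          % (period, growth_factor(10, period)))
# every 1.0 yr -> 1024x; every 1.5 yr -> ~102x; every 2.0 yr -> 32x,
# so which version of "Moore's Law" you quote matters enormously.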

Re:Enough with "moore's law" (1)

pla (258480) | more than 10 years ago | (#6749168)

It's just a wild guess. It has absolutely nothing to do with physics, which is the real laws we all live by. It has much more to do with human laws such as patents and copyrights that limit progress.

Though more than a "wild guess", you do have it right when you mention that it has no basis in physical reality.

I don't think I'd blame IP so much as marketing, though. The major player in the field, Intel, holds most of the relevant IP.

So why has Moore's Law worked for so long?

Because Intel schedules their releases of new products based on it.

If anyone remembers the days of the original Pentium, Intel made a few quiet claims that they already had two full chip generations nearly ready, just biding their time to maximize profit based on planned obsolescence ("Needs more testing" makes a great mantra to put off releasing products without earning too much bad PR).

So, in the commercial marketplace, Moore's Law has held true. In physical reality, during the late 80's and early 90's reality could have significantly outpaced Moore for a few years (though I suspect, considering current problems in making faster chips, that we would have had a corresponding slowdown from the end of the 90's to the present).

GaAs (1)

Kibo (256105) | more than 10 years ago | (#6748892)

Didn't some of the recent quantum gate breakthroughs come on the former heir apparent to silicon?

About Indium Phosphide. (-1)

Anonymous Coward | more than 10 years ago | (#6748893)

If you don't know what Indium Phosphide is, then this article [slashdot.org] will give you a good introduction to it.

Re:About Indium Phosphide. (0)

Anonymous Coward | more than 10 years ago | (#6748979)

No, it won't. Hope this helps. [w3.org]

Moore law will be no more (2, Insightful)

chompyZ (698259) | more than 10 years ago | (#6748895)

Hardware has progressed dramatically over the past decade and left software somewhere behind... there is not much use for faster and faster servers when software doesn't keep up the pace... this decade will be a "software decade".

Re:Moore law will be no more (2, Informative)

binaryDigit (557647) | more than 10 years ago | (#6749037)

Hardware has progressed dramatically over the past decade and left software somewhere behind... there is not much use for faster and faster servers when software doesn't keep up the pace... this decade will be a "software decade".

Not really. The functionality offered by software has pretty much flatlined (with the major exception being "media", e.g. mp3, mpeg, divx, etc). HOWEVER, the bloat and overhead of software continues to keep pace with (and often surpass) the speed of hardware. This trend has no end in sight (mo features, mo features, mo features. Lookat those scaled miniature window/icons sitting in my dock updating realtime, oooh, aaaah. Lookat that 3d rotating desktop). Not meaning to pick on Apple here (I own several myself), but they are at the vanguard of eye-candy code bloat, with Microsoft trying quickly to catch up.

Moore will fail (1)

kpansky (577361) | more than 10 years ago | (#6748904)

Moore's law is of course set with the assumption of silicon being used as the underlying semiconductor technology. With other semiconductor tech and even alternatives to the whole concept of semiconductors emerging, it is bound to fail eventually.

The Diamond Age (3, Informative)

wileycat (690131) | more than 10 years ago | (#6748915)

I"m pretty excited about the new man-made diamonds that are supposed to be able to keep moore's law going for decades when they come out. Wired had an article recently and a post here on /. too

Darpa brought us the internet? (1, Funny)

DaHat (247651) | more than 10 years ago | (#6748916)

What about Al Gore?

Re:Darpa brought us the internet? (-1, Redundant)

No2Gates (239823) | more than 10 years ago | (#6749106)

Exactly! Everyone knows Al invented the internet. He told us.

More about Moore (2, Funny)

The Old Burke (679901) | more than 10 years ago | (#6748917)

I don't know about you, but I'm starting to get fed up with this guy. His name has started to get on my nerves. That guy is everywhere. God damnit: it is not possible to have a decent discussion anymore without someone dragging in this guy and his so-called law.

Therefore I propose "Moore's Law 2: anyone mentioning his name in a discussion about semiconductors, CPUs, or transistors has lost the discussion."

amen (1)

Thinkit3 (671998) | more than 10 years ago | (#6748988)

It's not a law. It's a prediction. Poorly named, really. Do they call it Greenspan's law when he predicts lower inflation?

Re:More about Moore (0)

Anonymous Coward | more than 10 years ago | (#6749007)

Great, then we'll start seeing people invoking Moore's Law 2 like people already incorrectly invoke Godwin's law.

What about diamonds? (4, Interesting)

GreenCrackBaby (203293) | more than 10 years ago | (#6748931)

This diamond article in Wired (http://www.wired.com/wired/archive/11.09/diamond.html) seems to indicate that Moore's Law is sustainable for much more than ten more years.

Besides, I've been hearing about the death of Moore's Law for the last ten years.

Re:What about diamonds? (1)

MagikSlinger (259969) | more than 10 years ago | (#6749027)

Besides, I've been hearing about the death of Moore's Law for the last ten years.

It's a popular filler topic for industry journalists who have nothing better to report about. They'll just point out that the latest processor from AMD/Intel/etc is "reaching the limits of current technology" and then progress ipso facto to "Moore's Law could be dead in a few years".

I hereby predict (1, Troll)

teamhasnoi (554944) | more than 10 years ago | (#6748935)

Human Brain Processors. Of course, we'll have to pick only the best, so no Slashdot editors.

Example:
10 GOTU 4o
30 Re+URN; G0SUB 42
40 Print "Welcom to Windoes!":PRINGT "JUS KIFFING! HAHAHA!"
43 RUN
50 REM Copyright SCO(TM)(R)(C) 2012, NOT! HAHAHAHA
6o GOt0 14.3

Hey, it's a joke! Relax - no angry human brains will be used either!

Re:I hereby predict (0)

Anonymous Coward | more than 10 years ago | (#6749010)

Hahahahhah!

That was beautiful

Internet inventor (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6748938)

Is Al Gore going to help out? Since he invented the internet?

umm (3, Insightful)

bperkins (12056) | more than 10 years ago | (#6748940)

Let's keep in mind that Moore's Law was more an observation than a predictive law of nature, despite how people treat it that way.

Let's not and say we did.*

Seriously, I doubt that many people think that Moore's law is on an equal footing with, say, gravity and quantum mechanics. Still, an observation that has held more or less for nearly 40 years is worth considering as a very valuable guideline. Let's keep this in mind as well.

(*Why do vacuous comments like this make it into slashdot stories?)

It's bigger than Moore's law (1)

Psyx (619571) | more than 10 years ago | (#6749163)

"Moores Law was not the first, but the fifth paradigm, to provide exponential growth in computing. The next paradigm, which will involve computing in three dimensions rather than the two manifested in todays flat chips, will lead to computing at the molecular, and ultimately the subatomic level. We can be confident that the acceleration of computing will survive the well anticipated demise of Moore s Law."

- Ray Kurzweil

The entire text [cmu.edu]

Paradigm shift (3, Funny)

L. VeGas (580015) | more than 10 years ago | (#6748943)

This idea of speeding up processing speed is barking up the wrong tree and ultimately doomed to failure. We need to be focusing our attention on biochemistry and molecular biology. We already have drugs that slow your reaction time, thus making things appear to happen more quickly.

See, if we get everybody to take xanax or zoloft, there's no limit to how fast computers will appear to be working.

better yet (1)

Tumbleweed (3706) | more than 10 years ago | (#6748967)

> See, if we get everybody to take xanax or zoloft, there's no limit to how fast computers will appear to be working.

Let's just kill everyone, then our computers will seem infinitely fast! Dude, if you're gonna dream, Dream Big!

Maybe they simply need to... (1)

inode_buddha (576844) | more than 10 years ago | (#6748946)

just because of huge contract lead times, and this is just simple recognition of the fact. Any number of alternatives could pop up in the meanwhile (before anybody actually does anything), and that possibility needs to be accounted for.

I bet that's what it really is, anyway.

Dear Al Gore (0, Funny)

Letter (634816) | more than 10 years ago | (#6748949)

Dear Al Gore,

The obligatory joke about you inventing the Internet goes here.

Sincerely,
Letter

Re:Dear Al Gore (1)

jpmorgan (517966) | more than 10 years ago | (#6749121)

Dear Letter,

That joke was dumb three years ago.

Sincerely,
The rest of the world,

Re:Dear Al Gore (0)

Anonymous Coward | more than 10 years ago | (#6749157)

It's funny because he makes fun of those who would actually write that joke.

In keeping with the unique branding style . . . (1)

StefanJ (88986) | more than 10 years ago | (#6748953)

. . . introduced by the PR whizzes behind the Total Information Awareness name and logo [cafeshops.com], this new effort will be called either "SkyNet" or "Die Carbon Units," and feature a logo of a Borg drone ramming chips into the head of a howling toddler.

Stefan "It's finally out!" [sjgames.com] Jones

Re:In keeping with the unique branding style . . . (1)

donutz (195717) | more than 10 years ago | (#6749124)

I've heard that President Bush is already deeply concerned with the Defense Dept's SkyNet project [byzantinec...ations.com] ....

In related news... (1)

civilengineer (669209) | more than 10 years ago | (#6748954)

Scientists are looking for alternatives to rats for experiments: If rats are experimented on they will develop cancer. --Morton's Law

Clockwork's Corollary to Moore's Law (4, Funny)

The Clockwork Troll (655321) | more than 10 years ago | (#6748965)

Every 18 months, someone will develop a new law to compute the rate at which the estimate of the rate at which the number of transistors on semiconductor chips will double will halve.

Qubit (1)

dragonfly_blue (101697) | more than 10 years ago | (#6748970)

I thought that quantum computing was probably going to be viable within ten years, and will probably be far more advanced than any of the fabrication methods they listed in the article.


Their web site [darpa.mil] talks a little bit about DARPA's quantum computing projects, but the page seems to be a little outdated. Anyone know if they're pursuing this as well?

Re:Qubit (1, Informative)

Anonymous Coward | more than 10 years ago | (#6749056)

I thought that quantum computing was probably going to be viable within ten years, and will probably be far more advanced than any of the fabrication methods they listed in the article. Their web site talks a little bit about DARPA's quantum computing projects, but the page seems to be a little outdated. Anyone know if they're pursuing this as well?

The quant-ph list might have activity with freshness a little more to your liking: quant-ph Aug 2003 [lanl.gov]. Or, just check out xxx.lanl.gov [lanl.gov] - yes it's real, yes it's useful, no it's not goatse.

That said, with the potential applications of quantum computing in cryptography (especially brute-force cracking and decryption), it's unlikely that anything close to the bleeding edge is in the public eye.
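
The quadratic speedup usually cited for quantum brute-force search is Grover's algorithm; a quick back-of-the-envelope of what it would (and wouldn't) buy:

# Grover's algorithm searches N possibilities in ~sqrt(N) quantum steps,
# so a brute-force search over a k-bit keyspace takes ~2**(k/2) steps
# instead of 2**k -- it halves the effective key length, no more.
for bits in (56, 128, 256):
    print("%3d-bit key: classical ~2^%d steps, Grover ~2^%d steps"
          % (bits, bits, bits // 2))
# A 128-bit key under Grover is only as hard as a 64-bit key classically,
# which is why the crypto angle draws so much (quiet) funding.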

DARPA & Quantum Computing (2, Informative)

anzha (138288) | more than 10 years ago | (#6749099)

A lot of posters seem to think that DARPA, the US military, or the US government is a unified thing. It's not. Each part often has its own agenda. Research is very frequently driven by those agendas.

However, DARPA often CYAs when it comes to research too. If you come up with a whacky idea that might just work, they often will fund it even though it is in competition with another they have. The reason being that they then can see which whacky idea actually works. Often none do. Or one does. Or another that seemed like a sure thing doesn't.

Long story short: if quantum computing doesn't turn out to be all that, they've covered their techno @$$3$.

Re:True enough (1)

dragonfly_blue (101697) | more than 10 years ago | (#6749173)


They were even supposed to be funding the crackheads over at OpenBSD [com.com] for a while.

(Just kidding, Theo.)

By observation rules (1)

gmuslera (3436) | more than 10 years ago | (#6748973)

Murphy's laws are also more observation than prediction, but I think that technology changes will have no effect on them.

Let's get them all out of the way... (-1, Troll)

ReluctantBadger (550830) | more than 10 years ago | (#6748974)


AUTO LAMENESS INSERTION

  • Ooooh, but does it run Linux?

  • I, for one, welcome our new DARPA overlords...

  • Bite my shiny metal ass!

  • I don't work for DARPA, you insensitive clod!

  • Did anyone else read that as *INSERT STUPID REMARK HERE* ??

  • Yeah, I'll send Darl $699

  • 3: Profit!!!

  • *sigh* blah blah boring drone etc blah

  • I know I'm going to lose karma for saying this, but...

  • Microsoft's security model blows goats...

  • In Soviet Russia, Moore's Law looks beyond DARPA!

etc etc ad infinitum. You Slashbots can fucking SUCK IT like the little predictable unfunny bitches that you are.

That will be all. I thank you.

I hate Moore's Law (3, Insightful)

Liselle (684663) | more than 10 years ago | (#6748989)

Computer salesmen are using it like a club. You'd figure it would drive innovation; instead it drives CPU manufacturers to take advantage of consumer ignorance and do fairy magic with clock speeds. We should call it "Moore's Observation".

Another article on this (1)

PK_ERTW (538588) | more than 10 years ago | (#6748991)

It is about a year old now, but there was a good article [reed-electronics.com] in Semiconductor International about this.

PK

The correct paradigm, or why Linux is the answer (-1)

Amsterdam Vallon (639622) | more than 10 years ago | (#6748995)

I worked for Intel for three years after I graduated from college. I currently work for AMD as well.

One thing I've found is that hardware is overrated. I think the question here is vague, but I *know* the answer already -- Linux.

The advanced algorithms and techniques employed by Linux kernel developers are first class. This is the kind of software development lifecycle (SDL) that works.

Hardware increases might make things 1.5x or 2x faster at best. But with algorithm tweaking, you can sometimes make your programs 100x faster just by changing one line of code.

Never underestimate software, people.
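
The "100x from one line" claim is optimistic as stated, but algorithmic wins really can dwarf hardware gains; a minimal, hypothetical illustration in Python:

# Naive recursive Fibonacci is exponential time; one memoization line
# makes it linear. No hardware upgrade delivers a comparable jump.
import time
from functools import lru_cache  # a plain dict cache works the same way

def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)  # the "one line" in question
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

for fn in (fib_naive, fib_memo):
    t0 = time.time()
    fn(30)
    print("%s: %.4f s" % (fn.__name__, time.time() - t0))
# The memoized version wins by far more than the 1.5-2x a hardware
# generation would buy.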

GUH?! (0)

Anonymous Coward | more than 10 years ago | (#6749136)

BobAbooey had more barely plausible obfuscated technobabble than you in the nail of his little finger!

Moore Laws to come (1)

smatt-man (643849) | more than 10 years ago | (#6748999)

It seems every year "they" say that chips will have to change this or that to keep up, and manufacturers always seem to be able to come out with new processes to push speeds higher.

Parallel Computing (1, Interesting)

lawpoop (604919) | more than 10 years ago | (#6749014)

When we absolutely cannot put any more transistors on a chip, we will start making computers that are massively parallel. In the future, you will have a desktop computer with 2, 4, 8, 16, etc. chips in it.

All these other things they are talking about are vaporware. Parallel computing is here and in use now.

Re:Parallel Computing (0)

Anonymous Coward | more than 10 years ago | (#6749059)

shut the fuck up of course we can put more transistors on a chip

remember the molecular switch story? shiiiiit

transistors today are huge compared to that tiny ass shit

Re:Parallel Computing (2, Interesting)

Abcd1234 (188840) | more than 10 years ago | (#6749114)

Sounds like a nice idea for the desktop or for certain classes of research, but there will always be a place for massive computational capacity on a single chip, since there is a large class of computing problems which are not easily parallelizable and hence cannot take advantage of parallel computing.

Incidentally, there is also a limit to how fast your parallel computer will get... it's called the bus. If you can't build high-speed interconnects, or if your software isn't designed well (not as easy as it sounds!), you will inevitably have problems with the system bus becoming overtaxed. Heck, this is already a problem in our primarily single-CPU world.
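
Amdahl's law puts a number on "not easily parallelizable"; a minimal sketch, assuming a fixed serial fraction:

# Amdahl's law: with serial fraction s, speedup on n processors is
# 1 / (s + (1 - s) / n); the serial part (and the shared bus) caps you.
def amdahl_speedup(serial_fraction, n_procs):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

for n in (2, 4, 8, 16):
    print("%2d CPUs, 10%% serial: %.2fx" % (n, amdahl_speedup(0.10, n)))
# Even with infinitely many CPUs, a 10% serial fraction caps speedup at
# 10x -- throwing 2, 4, 8, 16 chips at a problem only helps so much.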

Re:Parallel Computing (2, Interesting)

HiThere (15173) | more than 10 years ago | (#6749175)

Yes, but the current parallel computers have huge performance costs...they can easily spend over half their time coordinating themselves on many kinds of problems. Of course, on well structured problems there probably isn't a better solution possible.

Two major answers occur to me:

Answer one is that we figure out how to automatically decompose problems into independently solvable threads.. a quite difficult problem.

Answer two is that we build special purpose parallel processors to handle parallelizable tasks efficiently (sound processors, visual cortexes, etc.) and use them as/from demons to maintain environmental awareness. Then we divvy everything we can into separate threads. Dedicate one CPU to coordinating between processes (with possibly hot-switchable backups). And do the best we can.

Answer two seems to be "sort of" the approach that biological design takes...at least in mammals.

morons looking through phonIE scriptdead ?pr?.. (-1, Troll)

Anonymous Coward | more than 10 years ago | (#6749026)

?firm? generated hypenosys/deception. no one is above the laws of humankind/nature.

the greed/fear/ego based corepirate nazis must deny the existence of the genuine/unlimited power supply.

moore about the creator's laws, & the real power, later.

what's wrong with folks selling their kode? if it causes convenience, & interoperates with all the other kode on the planet, we say, no harm, no foul, so long as you fail to employ gangsterious/felonious practices to asphyxiate the 'competition'. sabotaging your free version of anything is a tad dastardly. if there's value added, without FUDging up the compatability, we'll pay. same with music. no more gouging dough though.

fortunately, mr stallman et AL, etcetera, is now offering comparable/superior software, to the payper liesense spy/bug wear feechurned models, in almost every circumstance. there'll be few, if any more softwar billyonerrors, as if there's a need for even won. tell 'em robbIE. you are won of the last wons whois soul DOWt, right? .asp for va lairIE's whoreabull pateNTdead PostBlock(tm) devise?, used against the truth/to protect robbIE's payper liesense stock markup bosses/corepirate nazi 'sponsors'. yuk.

what might happen to US if unprecedented evile/the felonious georgewellian southern baptist freemason fuddite rain of error, fails to be intervened on?

you already know that too. stop pretending. it doesn't help/makes things worse.

they could burn up the the main processor. that would be the rapidly heating planet/population, in case you're still pretending not to notice.

of course, having to badtoll va lairIE's whoreabully infactdead, pateNTdead PostBlock(tm) devise, robbIE's ego, the walking dead, etc..., doesn't slow us down a bit.

that's right. those foulcurrs best get ready to see the light. the WANing daze of the phonIE greed/fear/ego based, thieving/murdering payper liesense hostage taking stock markup FraUD georgewellian fuddite execrable are #ed. talk about a wormIE cesspool of deception? eradicating yOUR domestic corepirate nazi terrorist/gangsters will be the new national pastime.

communications will improve, using whatever power sources are available.

you gnu/software folks are to be commended. we'd be nearly doomed by now (instead, we're opening yet another isp service) without y'all. the check's in the mail again.

meanwhile... for those yet to see the light.

don't come crying to us when there's only won channel/os left.

nothing has changed since the last phonIE ?pr? ?firm? generated 'news' brIEf. lots of good folks/innocents are being killed/mutilated daily by the walking dead. if anything the situations are continuing to deteriorate. you already know that.

the posterboys for grand larcenIE/deception would include any & all of the walking dead who peddle phonIE stock markup payper to millions of hardworking conservative folks, & then, after stealing/spending/disappearing the real dough, pretend that nothing ever happened. sound familiar robbIE? these fauxking corepirate nazi larcens, want us to pretend along with them, whilst they continue to squander yOUR "investmeNTs", on their soul DOWt craving for excess/ego gratification. yuk

no matter their ceaseless efforts to block the truth from you, the tasks (planet/population rescue) will be completed.

the lights are coming up now.

you can pretend all you want. our advise is to be as far away from the walking dead contingent as possible, when the big flash occurs. you wouldn't want to get any of that evile on you.

as to the free unlimited energy plan, as the lights come up, more&more folks will stop being misled into sucking up more&more of the infant killing barrolls of crudeness, & learn that it's more than ok to use newclear power generated by natural (hydro, solar, etc...) methods. of course more information about not wasting anything/behaving less frivolously is bound to show up, here&there.

cyphering how many babies it costs for a barroll of crudeness, we've decided to cut back, a lot, on wasteful things like giving monIE to felons, to help them destroy the planet/population.

no matter. the #1 task is planet/population rescue. the lights are coming up. we're in crisis mode. you can help.

the unlimited power (such as has never been seen before) is freely available to all, with the possible exception of the aforementioned walking dead.

consult with/trust in yOUR creator. more breathing. vote with yOUR wallet. seek others of non-aggressive intentions/behaviours. that's the spirit, moving you.

pay no heed/monIE to the greed/fear based walking dead.

each harmed innocent carries with it a bad toll. it will be repaid by you/us. the Godless felons will not be available to make reparations.

pay attention. that's definitely affordable, plus, collectively, you might develop skills which could prevent you from being misled any further by phonIE ?pr? ?firm? generated misinformation.

good work so far. there's still much to be done. see you there. tell 'em robbIE.

as has been noted before, lookout bullow.

mynuts wonce again? (0)

Anonymous Coward | more than 10 years ago | (#6749063)

can't say you're not pessimistic.

If anyone would know... (1)

PunXX0r (694958) | more than 10 years ago | (#6749042)

If there is a group on earth that would have some idea about what the next-stage technology to upset Moore's law will be, it's DARPA. It is possible that they are already 10-15 years ahead of what we get down here on earth (tin foil hat time), and are just planning to declassify it as it becomes cost-effective (read: profitable) to do so. Heh heh.

[Funny Slashdot Comment] How about... (0, Flamebait)

cerebralsugar (203167) | more than 10 years ago | (#6749044)

Chip foundries that, instead of founding chips the traditional way, Garflag Barg Butto Moogie KawwwooowwWwweeee!!!!

Sorry, you probably can't read that last part. I had to encrypt it as I don't own a patent on it yet. And if you Slashdotters try to break my encryption I will be forced to shoot you under the Digital Millennium Copyright Act. Don't worry though, you can pay me licensing fees for Linux too. Don't pay those people at SCO, my license is better!

By the way, I own the intellectual property of "modding down", be it the "troll" variety or the "off topic" variety, so for each such mod received I will charge you a (very reasonable) licensing fee of only $799.

Re:[Funny Slashdot Comment] How about... (1, Funny)

cerebralsugar (203167) | more than 10 years ago | (#6749072)

Flamebait, I forgot that one...

Re:[Funny Slashdot Comment] How about... (1)

Lane.exe (672783) | more than 10 years ago | (#6749076)

That's not funny. And neither are you.

Re:[Funny Slashdot Comment] How about... (-1, Offtopic)

cerebralsugar (203167) | more than 10 years ago | (#6749090)

Hey, is that a girl on your MP3.com page? Or do you just look like one?

It's OK, we all know the vagina rules the world..

WRONG! (1, Funny)

Anonymous Coward | more than 10 years ago | (#6749060)


"DARPA (the folks who brought you the Internet)


WRONG!

It was algore who brought us the internet!

Where have you been?

bunghole (-1, Troll)

Anonymous Coward | more than 10 years ago | (#6749065)

gnaa reccomends anuses cheeses

human terms (0, Offtopic)

omarques (685690) | more than 10 years ago | (#6749066)

"Is it a tank, is it a missile, is it a school bus?"
I Can't Believe It's Yogurt!
Hold fire!!! Oops, too late!
Sorry, kid! Bad time for holding this cup... looked like a grenade or something...

[bang head here] (1)

bersl2 (689221) | more than 10 years ago | (#6749073)

Is it too much to ask for everyone to just STFU about Moore's Law? Can't we just let it go quietly? I mean, aren't there more important things than sitting around arguing the relevance (if any) of Moore's Law?

Then again, I post on /.

Re:[bang head here] (0)

Anonymous Coward | more than 10 years ago | (#6749095)

Damn, forgot to preview...

They might violate Moore's Law... (0)

Anonymous Coward | more than 10 years ago | (#6749075)

...but they will never break Brannigan's Law! Brannigan's law is like Brannigan's love... hard and fast!

Moore's law is already ending (3, Informative)

Junks Jerzey (54586) | more than 10 years ago | (#6749082)

Moore's law is already ending. Intel's Prescott (i.e. Pentium 5) CPU dissipates 103 watts. That's beyond anything you can put in a laptop, and it's arguably beyond anything that should be in a workstation-class PC. But it also may not be that we're hitting CPU speed limits, just that we're hitting the limits of the types of processors that are being designed. Much of the reason the PowerPC line runs cooler than the x86 is because the instruction set and architecture are much cleaner. There's no dealing with calls to unaligned subroutines, no translation of CISC instructions to a series of RISC micro-ops, and so on. But there are the same fundamental issues: massive amounts of complexity dealing with out-of-order execution, register renaming, cache management, branch prediction, managing in-order writebacks of results, etc.

Historically, designing CPUs for higher-level purposes, other than simply designing them to execute traditional assembly language, has been deemed a failure. This is because generic hardware advanced so quickly that the custom processors were outdated as soon as they were finished. Witness Wirth's Lilith, which was soon outperformed by an off-the-shelf 32-bit CPU from National Semiconductor (remember them?). The Lisp machine is a higher profile example.

But now things are not so clear. Ericsson designed a processor to run their Erlang concurrent-functional programming language, a language they use to develop high-end, high-availability applications. The FPGA prototype was outperforming the highly-optimized emulator that they had been using up to that point by a factor of 30. This was with the FPGA at a clock speed of ~20MHz, and the emulator running on an UltraSPARC at ~500MHz. And remember, this was with an FPGA prototype, one that didn't even include branch prediction. Power dissipation was on the order of a watt or two.

Quite likely, we're going to start seeing more of this approach. Figure out what it is that you actually want to *do*, then design for that. Don't design for an overly general case. For example, 90% of desktop CPU use could get by without floating point math, especially if there were some key fixed point instructions in the integer unit. But every Pentium 4 and Athlon not only includes 80-bit floating point units, but massive FP vector processing units as well. (Not to mention outmoded MMX instructions that are almost completely ignored.)
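
As a concrete sketch of the "fixed point in the integer unit" idea (Python standing in for what would really be integer machine instructions):

# Q16.16 fixed-point arithmetic using only integer operations -- the
# kind of math most desktop code could use instead of a full FPU.
FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # represents 1.0

def to_fixed(x):
    return int(round(x * ONE))

def fixed_mul(a, b):
    # Integer multiply, then shift out the doubled fraction bits.
    return (a * b) >> FRAC_BITS

def to_float(a):
    return a / float(ONE)

a, b = to_fixed(3.25), to_fixed(0.5)
print(to_float(fixed_mul(a, b)))  # 1.625, computed with integer ops only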

THAT MAKES NO SENSE AND YOU'RE A TURD-LICKING HOMO (-1)

Subject Line Troll (581198) | more than 10 years ago | (#6749144)

morons WANdering whois really mynuts won? (-1, Troll)

Anonymous Coward | more than 10 years ago | (#6749122)

seems like there may have been a MiScouNT here?

I'm glad that the majority of posters... (Score:5, Insightful)

bmajik (96670) <matt@mattevans.org> | Wednesday August 20, @05:07PM | (#6748536)

are seeing this for what it is: "No big deal"

This is NOT big brother. This is about building valuable meta information on top of usenet. Why? Because one of the things MS heard long ago is that people liked linux because they could go to a newsgroup and get help with it, often from the people that wrote the component in question. What did MS do? They responded - MS employees now monitor the microsoft.public news groups. We respond to posts, try and solve problems for people, answer questions, debug code, etc etc. I myself can be found occasionally posting in the Visual Basic newsgroups (where we have lots and lots of non-full-time or beginning programmers that really need just a little bit of help to get them going).

The people that _write_ the VB compiler are now monitoring VB newsgroups to try and help connect with real customers and to really understand how people use and dislike MS products.

Managing and making sense out of the whole mess that is usenet is a nightmare, and MS Research is doing some good work in this area. MS has some internal software that treats usenet posts as "issues" and determines if they've been resolved or not, if they need followup, etc etc. One interesting thing we've found is that there are many issues resolved by "the community", i.e. non-MS employees that are subject matter experts. I don't know the details on this but I think we make an effort to track who is and isn't a great contributor and maybe they get some sort of compensation or recognition or something.. like i said i don't know the details of that at all..

In any case, the point of this usenet data mining is to try and analyze the incredibly huge sea of usenet. We want to figure out what kinds of problems people have, what people are causing noise, what people are really helping others, etc etc. There is no nefarious invasion of privacy here; the only thing that is analyzable is what people explicitly post to a public forum...

Look at my userid - i was a slashdot reader long before i worked where i currently do. Back then, the MS bashing and second guessing definitely took place, and i even participated. I'm still a slashdot reader but I do get awfully tired of the sheer volume and irrationality of negative-MS stuff that happens here.

When I started at MS, I found out awfully fast that many of my arguments against MS were speculative, but mostly it was me being factually wrong and talking out of my ass. I remember in my original interviews i was trying to lecture an NT developer about how putting GDI in kernel for NT4 was stupid because it would lead to crashes. How pompous of me! It was something I read on some stupid website or industry rag. Later I found out (from reading Inside W2k -- excellent book) that it was irrelevant, because if the session manager sees that the GDI user-land process exits/crashes for some reason, it reboots the box anyhow, i.e. a problem with GDI reboots the box either way.

So after 8+ years of hating MS and talking out of my ass, followed by 3+ years of working at MS and realizing how much i was talking out of my ass, I'm doing two things:

1) talking out of my ass less
2) telling others that are clearly talking out of their ass that they are doing so, so that they can
2a) stop spreading misinformation
2b) have their eyes opened that nobody is impressed by their incorrect speculations and their emotional campaigns of disinformation

I know im not preaching to a sympathetic audience here, but honestly, the speculation, questions, etc people have about MS could be answered truthfully and honestly if some of you would bother to ask, or do some research. But unfortunately i know all too well (because i used to do it) that it's easier, and certainly more fun, to believe everything you _want_ to believe about MS that bolsters your own predetermined mindset. If, for example, you find yourself referring to an article that The Register wrote, please stop and ask yourself what the hell the register knows.

Read the rest of this comment... [slashdot.org]

I'm looking forward to... (2, Funny)

Linker3000 (626634) | more than 10 years ago | (#6749152)

The era of biological computing when I can just sneeze on my PC to double its RAM!

Another great DARPA product! (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6749155)

Self-healing landmines!!