
The Death of the Silicon Computer Chip

kdawson posted more than 6 years ago | from the aye-and-it-had-a-good-run dept.

Intel

Stony Stevenson sends a report from the Institute of Physics' Condensed Matter and Material Physics conference, where researchers predicted that the reign of the silicon chip is nearly over. Nanotubes and superconductors are leading candidates for a replacement; they don't mention graphene. "...the conventional silicon chip has no longer than four years left to run... [R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years. Just as Gordon Moore predicted in 2005, physical limitations of the miniaturized electronic devices of today will eventually lead to silicon chips that are saturated with transistors and incapable of holding any more digital information. The challenge now lies in finding alternative components that may pave the way to faster, more powerful computers of the future."


150 comments


I'll... (5, Insightful)

PachmanP (881352) | more than 6 years ago | (#22892956)

...believe it when I see it!

Re:I'll... (4, Insightful)

scubamage (727538) | more than 6 years ago | (#22892990)

I agree. We have the methods to use other materials, but silicon is plentiful and VERY cheap. Like, the majority of the earth's composition cheap. Grab a handful of dirt ANYWHERE and a large portion will be silicon. Even if it gets replaced for certain high-end hardware, I doubt silicon will be going anywhere anytime soon - it's simply too affordable.

Re:I'll... (5, Insightful)

CRCulver (715279) | more than 6 years ago | (#22893076)

It's not as if carbon is scarce either.

Re:I'll... (3, Insightful)

gyranthir (995837) | more than 6 years ago | (#22893408)

The issue of Carbon is the cost, scalability, accuracy, and timeliness/speed of nanotube production. Not the resource itself.

Re:I'll... (4, Insightful)

twistedsymphony (956982) | more than 6 years ago | (#22893576)

The issue of Carbon is the cost, scalability, accuracy, and timeliness/speed of nanotube production. Not the resource itself.
What's that quote? "Necessity is the mother of Invention." or something along those lines.

Silicone was expensive to refine and manufacture at one point too. Like all new technologies, the REAL cost is in the manufacturing, and the cost goes down once we've manufactured enough of it to refine the process until we know the cheapest and quickest ways to do it.

Re:I'll... (3, Insightful)

gyranthir (995837) | more than 6 years ago | (#22893678)

That may be true, but that isn't going to change in 4 years. The replacement ideas have been around for a good while now, and production, repeatability, and scalability are still nowhere near cost-effective, even for minimal production needs. And not to nitpick, but it's Silicon, not cone.

Re:I'll... (2, Insightful)

Anonymous Coward | more than 6 years ago | (#22893930)

Nitpicking? I hope you're being sarcastic. :) If you had pointed out grammatical errors, that might be nitpicking. But a person is pretty far into "STFU" territory when they presume to say something meaningful about the element silicon and confuse it with silicone.

Re:I'll... (5, Interesting)

Beetle B. (516615) | more than 6 years ago | (#22893830)

Like all new technologies, the REAL cost is in the manufacturing, and the cost goes down once we've manufactured enough of it to refine the process until we know the cheapest and quickest ways to do it.
Cost is not the main problem with nanotubes.

Nanotubes have a certain chirality - denoted by (m,n) with m and n being integers. Those two numbers define the properties of the nanotube (e.g. if m-n is a multiple of 3, the nanotube is metallic - otherwise it is semiconducting). They also determine the radius.

So far no one has come up with a way to synthesize a nanotube of a chosen chirality. They just synthesize many nanotubes and then manually pick out the ones they want - if they exist in the sample. Until they can do this, the nanotube industry will not become a reality.
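
To make the chirality rule above concrete: a minimal Python sketch of the classification, using the standard diameter formula d = a*sqrt(n^2 + n*m + m^2)/pi with graphene lattice constant a ~= 0.246 nm (the (n,m) values below are illustrative only).

    import math

    A_NM = 0.246  # graphene lattice constant in nm

    def classify(n, m):
        # Metallic iff (n - m) is a multiple of 3; semiconducting otherwise.
        kind = "metallic" if (n - m) % 3 == 0 else "semiconducting"
        # Standard single-walled nanotube diameter formula.
        d_nm = A_NM * math.sqrt(n * n + n * m + m * m) / math.pi
        return kind, d_nm

    for n, m in [(5, 5), (9, 0), (10, 0)]:
        kind, d = classify(n, m)
        print(f"({n},{m}): {kind}, diameter ~{d:.3f} nm")

The (5,5) armchair tube comes out metallic at ~0.68 nm while (10,0) is semiconducting - exactly the property split that makes unsorted synthesis such a problem.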

Re:I'll... (1)

BlueParrot (965239) | more than 6 years ago | (#22894888)

I dunno; presumably the tubes with a different chirality have uses in other applications, so could you not just sort the lot rather than throwing away a bunch of them?

Re:I'll... (2, Interesting)

Beetle B. (516615) | more than 6 years ago | (#22895026)

The issue is not so much that some are being "wasted". The problem is selecting the ones you want. How do you automate that? You have a process that gives you lots and lots of nanotubes. How do you automatically filter out the ones you want? That's been the problem since day 1, and has not been resolved.

Re:I'll... (2, Interesting)

camperdave (969942) | more than 6 years ago | (#22895304)

My admittedly limited understanding of carbon nanotubes is that they are self producing. What I mean by that is that if one stable tube diameter is 20 atoms, and another stable tube diameter is 30 atoms, then the 20 atom tube is going to continue to grow as a 20 atom tube. It won't spontaneously widen out to a 30 atom tube. If that is the case, then all you would need is a few seed nanotubes, and the right conditions.

Re:I'll... (1)

cyfer2000 (548592) | more than 6 years ago | (#22894912)

Or selectively remove (etch) nanotubes with a certain chirality.

Pet Peeve (2, Informative)

pleappleappleap (1182301) | more than 6 years ago | (#22894098)

Silicone != Silicon

Re:Pet Peeve (1)

Comboman (895500) | more than 6 years ago | (#22894432)

Silicone != Silicon

Unless you're a female robot with a boob job.

Re:I'll... (1)

gyranthir (995837) | more than 6 years ago | (#22894394)

Another quote comes to mind: "If it ain't broke, don't fix it."
So far, Silicon is not broke, and is currently scaling pretty well, especially with advances in production and with new additives and compositions.
Examples: Gallium-arsenide, Silicon-carbide
http://en.wikipedia.org/wiki/Wafer_(electronics) [wikipedia.org]

Re:I'll... (2, Insightful)

maxume (22995) | more than 6 years ago | (#22893758)

Those are all issues with silicon as well (crystals vs. nanotubes...); they are just reasonably well solved.

Re:I'll... (1)

gyranthir (995837) | more than 6 years ago | (#22894040)

We started learning how to refine and purify Silicon around 1915. We started using Silicon in integrated circuits, silicon wafers, etc., around 1960. That's 48 years of mass production, with many iterations and refinements along the way. We started learning about Carbon Nanotubes in the 1970s, and finally in 1993 started being able to produce them in any appreciable quantity. The production process that we have today is nowhere near the scale of Silicon in the 1960s. The CVD process we currently use to produce carbon nanotubes is slow, and not terribly cost effective, even with the recent advances in technology. We are just in the beginning stages of this process, and 4 years is a terribly short-sighted estimate of the time needed to replace silicon as the IC material of choice.

Re:I'll... (1)

Magada (741361) | more than 6 years ago | (#22895008)

Mass production? Who used silica crystals in mass-production quantities in 1916 and for what? Your alternate history timeline interests me and I wish to subscribe to your newsletter.
Also, crystal silicon was first obtained in 1854.

Re:I'll... (4, Informative)

gyranthir (995837) | more than 6 years ago | (#22895176)

http://en.wikipedia.org/wiki/Silicon [wikipedia.org]

No one said anything about mass production in 1916, read the post again.

We started learning to purify it in the 1910's.

From Wikipedia:
The earliest method of silicon purification, first described in 1919 and used on a limited basis to make radar components during World War II, involved crushing metallurgical grade silicon and then partially dissolving the silicon powder in an acid. When crushed, the silicon cracked so that the weaker impurity-rich regions were on the outside of the resulting grains of silicon. As a result, the impurity-rich silicon was the first to be dissolved when treated with acid, leaving behind a more pure product.

From: http://en.wikipedia.org/wiki/Integrated_circuit [wikipedia.org]
The first integrated circuits were manufactured independently by two scientists: Jack Kilby of Texas Instruments filed a patent for a "Solid Circuit" made of germanium on February 6, 1959. Kilby received several US patents.[4][5][6] Robert Noyce of Fairchild Semiconductor was awarded a patent for a more complex "unitary circuit" made of Silicon on April 25, 1961. (See the Chip that Jack built for more information.)

The first integrated circuits contained only a few transistors. Called "Small-Scale Integration" (SSI), they used circuits containing transistors numbering in the tens.

SSI circuits were crucial to early aerospace projects, and vice-versa. Both the Minuteman missile and Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo guidance computer led and motivated the integrated-circuit technology, while the Minuteman missile forced it into mass-production. These programs purchased almost all of the available integrated circuits from 1960 through 1963, and almost alone provided the demand that funded the production improvements to get the production costs from $1000/circuit (in 1960 dollars) to merely $25/circuit (in 1963 dollars).[citation needed] They began to appear in consumer products at the turn of the decade, a typical application being FM inter-carrier sound processing in television receivers.

The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "Medium-Scale Integration" (MSI).

They were attractive economically because while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work (because of fewer separate components), and a number of other advantages.

Further development, driven by the same economic factors, led to "Large-Scale Integration" (LSI) in the mid 1970s, with tens of thousands of transistors per chip.

Integrated circuits such as 1K-bit RAMs, calculator chips, and the first microprocessors, that began to be manufactured in moderate quantities in the early 1970s, had under 4000 transistors. True LSI circuits, approaching 10000 transistors, began to be produced around 1974, for computer main memories and second-generation microprocessors.

Re:I'll... (4, Funny)

somersault (912633) | more than 6 years ago | (#22893090)

I always wondered why those implants felt like a bag of sand..

Re:I'll... (4, Insightful)

ScrewMaster (602015) | more than 6 years ago | (#22893106)

I doubt silicon will be going anywhere anytime soon - it's simply too affordable.

Yes, and we're so damned good at manipulating it. All this newfangled stuff is pie-in-the-sky at this point. Yes, I suppose we'll eventually replace it for the likes of high-end processors, as you say, but everything else will be made out of silicon for a long time to come.

People keep bringing up Moore's Law, as if it's some immutable law of physics. The reality is that we've invested trillions of {insert favorite monetary unit here} in silicon-based tech. Each new generation of high-speed silicon costs more, so that's a lot of inertia. Furthermore, if Guilder's Rule holds true in this case (and I see no reason why it shouldn't) any technology that comes along to replace silicon will have to be substantially better. Otherwise, the costs of switching won't make it economically viable.

Re:I'll... (3, Informative)

geekoid (135745) | more than 6 years ago | (#22893228)

replace 'better' with 'more value'.

For example:
If it costs 1/100 the price but sees no end-user gains in 'speed' and/or 'power', it could replace silicone. It's not better at doing anything, it just has a higher value.

"All this newfangled stuff is pie-in-the-sky at this point."

hmmm, some of this is a lot farther along than pie in the sky.

Most people on /. don't even seem to understand Moore's law and think it has to do with speed and power, which it doesn't; those are artifacts of the law.

Finally:
The real problem with silicone is the fabs. They are running into some serious problems at these incredibly small sizes. Some fabs are having problems with metal atoms in the air, atoms that are below the threshold of detection and of our ability to remove.

I am not dooming and glooming silicone here (although there are some advantages to hitting a minimum size); it's just that some problems aren't going away, they are getting harder to deal with, and the past workarounds aren't cutting it.
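
The density-versus-speed point is easy to see with a toy projection. A minimal Python sketch, assuming the common 24-month-doubling reading of Moore's law; the 2008 starting count is an illustrative round number, not a vendor figure:

    # Moore's law concerns transistor count per chip, not clock speed.
    count = 8e8  # illustrative: ~0.8 billion transistors in 2008
    for year in range(2008, 2021, 2):
        print(f"{year}: ~{count / 1e9:.1f}B transistors")
        count *= 2  # one doubling per ~24 months

Clock speed appears nowhere in the calculation; speed and power gains are side effects of the shrinking that delivers the count.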

Re:I'll... (0)

Anonymous Coward | more than 6 years ago | (#22893274)

Your post would be more insightful if you knew the difference between "silicon" and "silicone".

Re:I'll... (4, Interesting)

scubamage (727538) | more than 6 years ago | (#22893456)

You make some good points and I can't really argue them. As die sizes continue to get smaller, silicon wafers must be more and more pure, because tinier artifacts in the wafer can cause issues in the manufacturing process, and that's going to be pretty unavoidable. However, it also means that more dies can be stamped onto each wafer, which should offset the number that are lost. I meant more that even if computer hardware is replaced with something else, things which need lower-grade integrated circuits are still going to use silicon. I mean, you don't need a 1THz processor for a car's ECU, or for a garage door opener. And as more and more appliances become "smart", more things are going to need lower-end chips - so I highly doubt that silicon is going anywhere. Maybe not for PCs, but for everything else that is just starting to get 'wired', silicon is going to be around for a VERY long time.

Re:I'll... (1, Insightful)

suggsjc (726146) | more than 6 years ago | (#22894512)

I mean, you don't need a 1THz processor for a car's ECU, or for a garage door opener.

absolutely positively undeniably 100% wrong

Just because your garage door opener can't "solve" Folding@Home doesn't mean that we can't dream. I mean, at some point we truly need to be able to say something like "well my garage door opener has more processing power than BlueGene/L did in 2008"

Seriously, get over yourself and your "reality"

Re:I'll... (2, Funny)

scubamage (727538) | more than 6 years ago | (#22894612)

absolutely positively undeniably 100% wrong
I deny your reality and substitute my own ;)

Re:I'll... (2, Funny)

ColdWetDog (752185) | more than 6 years ago | (#22894658)

I mean, you don't need a 1THz processor for a car's ECU, or for a garage door opener.

absolutely positively undeniably 100% wrong

sometime in the 1THz-Garage-Door-Opener-Overlord-future:

GARAGE_OWNER: "Open the garage door please, Hal"

GARAGE_DOOR: "I'm sorry Dave, I can't do that."

You're saying that you want this sort of thing to happen? No thanks. I like my appliances simple and mute, thankyouverymuch.

Re:I'll... (1)

seven of five (578993) | more than 6 years ago | (#22894672)

"well my garage door opener has more processing power than BlueGene/L did in 2008"

"Open garage door, please, HAL."
"I'm sorry, Dave, I can't do that."
(pause)
"Why not?"
"I think you know the answer to that question."

Re:I'll... (1)

suggsjc (726146) | more than 6 years ago | (#22894680)

Flamebait, really?
Guess people's sarcasm detectors aren't working.

Re:I'll... (4, Insightful)

petermgreen (876956) | more than 6 years ago | (#22893196)

I'm pretty sure the cost of the raw material is a negligible part of the cost of making semiconductor-grade silicon. Most of the cost is in the very energy-intensive purification processes.

The real advantage of silicon for many years was that SiO2 was/is a decent gate material for MOSFETs and an insulator for isolating the metal from the main body of the IC, and could be grown easily on the surface of silicon. But afaict this advantage has dwindled, as we need CVD-deposited insulators for insulating between multiple metal layers anyway, and as processes have got smaller there is a push to switch to other gate materials for better performance.

The main advantage of silicon right now is probably just that we are very used to it and know what does and doesn't work with it. Other semiconductors are more of an unknown.

Even if silicon gets displaced from things like the desktop/server CPU market, though, I suspect it will stick around in lower-performance chips.

Re:I'll... (1, Interesting)

Anonymous Coward | more than 6 years ago | (#22894554)

Why does it have to be smaller all the time? Would it be that bad to double the size of the wafer to add twice the transistors at this point?

Re:I'll... (5, Interesting)

iamhassi (659463) | more than 6 years ago | (#22893340)

"I doubt silicon will be going anywhere anytime soon - its simply too affordable."

Agreed. Besides, they've been saying this since the 90s, that silicon can't possibly get any faster and it'll be replaced very soon.

I call BS. They had 350 gigahertz silicon chips 2 years ago [news.com] :
"At room temperature, the IBM-Georgia Tech chip operates at 350GHz, or 350 billion cycles per second. That's far faster than standard PC processors today, which range from 3.8GHz to 1.8GHz. But SiGe chips can gain additional performance in colder temperatures....SiGe chips, the scientists theorized, could eventually hit 1 terahertz, or 1 trillion cycles a second."

I think silicon is safe for awhile longer.

Re:I'll... (2, Interesting)

smackt4rd (950154) | more than 6 years ago | (#22894280)

That 350GHz chip is probably much simpler and easier to build than a CPU, but the fact remains that it'd be incredibly difficult to just switch from Si to some other semiconductor and still be able to build something cheap. We're already starting to see manufacturers switch their architectures to multi-core CPUs. I think that's a lot more practical than trying to switch to an exotic material.

Re:I'll... (1)

Beetle B. (516615) | more than 6 years ago | (#22894464)

I call BS. They had 350 gigahertz silicon chips 2 years ago:
Yes, that's a record for silicon based devices, as you mentioned.

However, the record for fastest transistor has been held by III-V based transistors (i.e. not silicon) for a few years now. See this [sciencedaily.com] , for example.

So the article's not all that wrong.

Silicon most common? No. (2, Funny)

pleappleappleap (1182301) | more than 6 years ago | (#22894076)

I know I'm being a little pedantic, but Silicon is NOT the most common element in the Earth. It is the most common element in the CRUST of the Earth. The most common element of the Earth is Iron. The Earth is an impure ball of iron oxides.

Re:I'll... (3, Funny)

Zaatxe (939368) | more than 6 years ago | (#22894854)

Grab a handful of tit ANYWHERE and a large portion will be silicone.

There, I fixed that for you.

Re:I'll... (1)

sm62704 (957197) | more than 6 years ago | (#22893460)

Silicon is dying! I knew it! The vacuum tube is making a comeback!

Wait a minute, the monitor I'm staring at is a vacuum tube. They told me vacuum tubes were gone a couple of decades ago, and they're still in guitar amps, too.

I predict that this prediction about the demise of silicon is as accurate as their predictions about the demise of vacuum tubes. But in four years nobody's going to remember their prediction, or mine either.

Re:I'll... (1)

Sponge Bath (413667) | more than 6 years ago | (#22893594)

...the monitor I'm staring at is a vacuum tube

Wow grandpa, your face is so tan!

Re:I'll... (1)

sm62704 (957197) | more than 6 years ago | (#22894056)

I'm at work. But you know, my monitor at home is a tube, too. So is my TV screen, a forty two inch trinitron, 214 pounds. Has the added benefit of being hard to steal.

OT but I'm curious, where did your username come from?

Re:I'll... (2, Insightful)

maddriller (1148483) | more than 6 years ago | (#22893810)

It is sort of silly to declare the end of life for one technology when the technology to replace it is not yet in place. Every year for the last twenty, people have proclaimed the end of silicon's reign, yet we still use silicon. There is a huge investment in the existing silicon infrastructure that will have to be duplicated in any replacement technology. There is also the educational inertia - engineering schools are still teaching people to use silicon, and it will be many years before they start teaching anything else. Silicon will be around for a long time to come.

Re:I'll... (1)

cyfer2000 (548592) | more than 6 years ago | (#22894988)

I think the paper only said the conventional silicon chip will go away. There are many nonconventional silicon technologies, like finFETs, silicon nanorods, etc.

Re:I'll... (1)

bkr1_2k (237627) | more than 6 years ago | (#22895198)

Yeah, no doubt. Silicon may or may not be the industry standard in 10 years, but saying it only has 4 more years of life is ridiculous to say the least. We're still using 200 MHz processors, for Christ's sake. The difference is that now we're using them in smaller, more "consumable" devices, rather than as our primary machines. Silicon has at least 2 generations (human generations) of life before we see it truly dead.

Let them speculate ... (5, Insightful)

ScrewMaster (602015) | more than 6 years ago | (#22893006)

[R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years.

In the meantime, other researchers will figure out ways to make silicon work smarter, not harder.

Re:Let them speculate ... (3, Funny)

Himring (646324) | more than 6 years ago | (#22893194)

They can have my silicon chip when they pry it from my cold, dead, motherboard....

Re:Let them speculate ... (1)

Bloodoflethe (1058166) | more than 6 years ago | (#22894620)

"That's the idea..."
*blaster goes off*



Han shot first.

Re:Let them speculate ... (1)

zappepcs (820751) | more than 6 years ago | (#22893294)

Absolutely. I don't think that code for multicore CPUs is fully baked yet, nor near the end of what can be done with it. FPGAs are improving and we still have not seen that component type hit saturation yet. We are nowhere near done doing all that can be done with the silicon we already have, never mind what is coming down the pipe in the next few years.

It's hype, nothing but.

I'd like to see something that is vastly better, cheaper, more energy efficient, and capable of greater performance... but until that comes along silicon is going nowhere.

Re:Let them speculate ... (1)

billcopc (196330) | more than 6 years ago | (#22894806)

There's always someone yapping about FPGAs, but the truth is I have yet to see an FPGA used in a production environment. They're great for prototyping, but how about real-time reconfigurable devices outside the lab? Surely there must be tasks where such a thing would be beneficial, as an alternative to GPGPU-type stuff.

Re:Let them speculate ... (0)

Anonymous Coward | more than 6 years ago | (#22893406)

I'm sure you say this because, like so many Slashdot users, you have an intimately detailed knowledge of and involvement in cutting-edge semiconductor research... unlike, say, the attendees of the Institute of Physics' Condensed Matter and Material Physics conference, who are widely known for talking out of their collective wazoo.

Re:Let them speculate ... (1)

ScrewMaster (602015) | more than 6 years ago | (#22894644)

No need to be snide.

For the past thirty years I've been hearing about the "ultimate limits" of silicon-based technology, and the various advancements using {insert favorite exotic material here} that were going to supplant it. I have some history on my side when I say that predictions of silicon's demise have always proven premature, and I'm not convinced that this time is any different. If I'm wrong, great ... we'll all have terahertz processors in our Playstations. But I'm not holding my breath.

Didn't slashdot already cover it? (1)

poetmatt (793785) | more than 6 years ago | (#22893016)

Didn't we essentially already talk about a processor replacement with Graphene? [slashdot.org] It wasn't that long ago that such a thing was posted... although I don't know, from a truly technical standpoint, whether that is viable or not.

Not again (5, Informative)

Maury Markowitz (452832) | more than 6 years ago | (#22893034)

I've been hearing this claim every few years for the last 25. Remember optical computers in the mid-80s? How about gallium arsenide? CRAY-3 anyone?

And of course what's really reaching a limit is not the CPU's, but our ability to use them effectively. See "TRIPS architecture" on the wiki as an example end-run around the problem that offers hundred-times improvements using existing fabs.

Maury

Re:Not again (2, Insightful)

esocid (946821) | more than 6 years ago | (#22893248)

Yeah, agree with you there. The article said they will be replaced within 4 years...yeah right. Maybe in 10 years something will come out that may be faster, but marginally more expensive. I don't see silicon exiting the technology world altogether within even the next 50 years. Some parts may be replaced but Si chips will still be kicking.

Re:Not again (1)

geekoid (135745) | more than 6 years ago | (#22893382)

haha, TRIPS.

So this other technology that claims it will be going by 2012 is not going to happen, but this other technology that claims it will be going by 2012 is a shoo-in!

Sorry, you will need more than that. All these slashdot articles remind me of when tubes went away*; the same arguments.

*Yes, I KNOW there are devices that use tubes, seriously. When was the last time you saw a tube tester in a grocery store?

Re:Not again (1)

exultavit (988075) | more than 6 years ago | (#22893550)

And of course what's really reaching a limit is not the CPU's, but our ability to use them effectively.
I think there are about three issues being confused here. One is progress in fabrication techniques. Another is the semiconductor physics at small scales. The third is systems architecture.

I agree that this last problem is currently the most immediately pressing one. But that doesn't mean that the first two won't also be major problems in the near future, as we are approaching the limits of our current techniques.

Re:Not again (1)

Maury Markowitz (452832) | more than 6 years ago | (#22895520)

> as we are approaching the limits of our current techniques

For sure... after many decades we _are_ approaching real physical limits. But I guess for me the real question is "so what?" Or to put it another way, "was silicon ever the problem?" My own CPU is way faster than the rest of the computer it's attached to, and the bottlenecks are almost always either HD or GPU related.

If we really can extract vastly more performance though architecture changes, then at some point you have to do a price/performance comparison and decide where to move. The market seems to be _extremely_ effective at this. I'm watching these developments with interest (and various new memory technologies as well).

Let's not forget DARPA's hand in all of this, historically at least. Everyone remembers their efforts that led to the internet, but does anyone remember the VLSI Project? It was arguably as important to the current state of computing as anything else. I was rather saddened to see recent reports of their efforts being re-focused to short-term development of "weapons"-ish topics.

Maury

Re:Not again (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22893726)

See "TRIPS architecture" on the wiki

Sorry, can't find it on the wiki. [c2.com] ... wait, do you mean Wikipedia? There are other wiki's besides Wikipedia, you know, and a fair number of them started well before Wikipedia was even a twinkle in Jimbo's eye. If anything deserves the label "the wiki", it's the first one - Ward's Wiki [c2.com] a.k.a. the Wiki Wiki Web a.k.a. the Portland Pattern Repository.

Re:Not again (1)

Beetle B. (516615) | more than 6 years ago | (#22894502)

I've been hearing this claim every few years for the last 25. Remember optical computers in the mid-80s? How about gallium arsenide? CRAY-3 anyone?
While GaAs has not replaced silicon for computer CPU's, it has some applications (non-optical ones) that Si cannot compete with. Cell phones, for example.

gaas - a little nostalgia. (2, Informative)

flaming-opus (8186) | more than 6 years ago | (#22894662)

Gallium arsenide was a reasonable technology to pursue at the time. It had teething problems, was expensive to manufacture, and CCC ran into funding problems related to a drop-off in defense spending after the end of the Cold War. That is not to say that GaAs was a completely foolish technology for the time. There are many reasons to believe that it offered faster switching times and smaller module packages than did ECL logic of the time. CCC was putting out a 500MHz machine in the early 90's, four years before ECL machines hit that speed, and six years before CMOS could.

Of course, wire delays started to become a concern for multi-board processors, and CMOS began to deliver enough transistors on a package that out-of-order superpipelining became possible, and the performance advantage of a slightly higher-clocked ECL/GaAs processor evaporated. This is not to say that there was not a good six-to-seven-year window of opportunity for gallium arsenide, while CMOS was still pretty feeble. I'll also point out that GaAs has continued to be used in specialized applications like SerDes, high-speed signal drivers, and cell-communications drivers. You're never going to get millions of MESFETs on a chip, but they work really well if you need a few dozen really fast drivers.

As for TRIPS, and a lot of other designs like it, it essentially is working on the problem that modern CMOS introduced. We have more transistors than we know what to do with, but we can't drive them any faster. I've seen some clever designs that are very good at solving one type of problem. I have yet to see a design that solves the problem in the general case, and with minimal change in the programming model. A lot of smart people are working on the problem, however, so I suspect that something will come about; it may not happen quickly, however.

Next phase: Transistors in Silicone (1)

LiquidCoooled (634315) | more than 6 years ago | (#22893036)

Well, we use most silicon to display boobs, might as well repay the favour :)

Overclocking might be fun as well: "Hey, I managed a stable DD at room temperature!"

Re:Next phase: Transistors in Silicone (1)

ScrewMaster (602015) | more than 6 years ago | (#22893840)

Well, we use most silicon to display boobs, might as well repay the favour :)

Overclocking might be fun as well: "Hey, I managed a stable DD at room temperature!"


In that context, you probably should have said, "overcocking". I know I get enough emails on that subject every day.

Again? (1)

notabaggins (1099403) | more than 6 years ago | (#22893040)

I remember reading about the death of silicon in the 1970s...

(Okay, dating myself here, but still...)

Re:Again? (1)

R3N3G4D3 (1227590) | more than 6 years ago | (#22893108)

I also doubt that in 4 years we will be switching over. Nanotubes are still too expensive and hard to produce. Both Intel and ARM (the cellphone chip maker) already have plans for 20nm technology, which might be with us for another decade or so. If anything, in 4 years we will probably start seeing nanotube chips, but they will be an overpriced item for shoppers with deep pockets, like solid state drives have been for the last few years until they started dropping in price.

i for one (0)

ionix5891 (1228718) | more than 6 years ago | (#22893070)

welcome our new FASTER graphene overlords

yeah right (-1, Flamebait)

Anonymous Coward | more than 6 years ago | (#22893094)

1) The guy's name is Stony. I don't get my information from people with kid's names.
2) The "conventional" silicon chip has re-defined itself many times over the years. We use silicon-germanium chips now.

Much sillio articulo (5, Insightful)

Ancient_Hacker (751168) | more than 6 years ago | (#22893126)

Let's think: a technology that has taken 60 years to go from lab to today's level is going to be superseded in five years by a technology that has not yet made a single transistor or gate. Hmmmm..... Meanwhile silicon is not going to be improved in any obvious way, such as with ballistic transistors, gallium arsenide, silicon carbide, 3-D geometries, process shrinkage, etc, etc, etc, etc, etc, etc.... No soup for you.

Quick.... (1)

parseexception (516727) | more than 6 years ago | (#22893130)

we must find a way to pay our salaries and secure our jobs through the recession. I know: get out the ol' "Silicon is Dead" article and dust it off; that should last us 4 years

A little pre-mature don't you think ?! (1)

giorgist (1208992) | more than 6 years ago | (#22893140)

How about: cars are dead, hover cars are on their way
G

REplace with??? (0)

Anonymous Coward | more than 6 years ago | (#22893184)

Right, so the article goes on to say silicon only has 4 years left,

YET then goes on to say they just need to find something to replace it.

WTF!!!!!
That's like saying that petrol's days are numbered; we just need to find something to replace it.

Unlikely (5, Informative)

aneviltrend (1153431) | more than 6 years ago | (#22893202)

Intel's CTO Justin Rattner just gave a talk at Cornell two days ago; he covered this topic carefully and confirmed that Intel has the technology and plans to carry out Moore's Law for another 10 years on silicon. Technologies such as SOI [wikipedia.org] and optical interconnects will be leveraged to hit this.

It's not necessarily the size of the transistors that makes chips hard to make these days either (although they are now giving us huge problems with leakage current). It's harder to route the metal between these transistors than it is to pack them onto the silicon. New processors from Intel and AMD have areas with low transistor density just because it was impossible to route the large metal interconnects between them. Before we can take advantage of even smaller transistors we'll need a way to achieve higher interconnect density.

Re:Unlikely (4, Interesting)

geekoid (135745) | more than 6 years ago | (#22893334)

hmmm, I trust the people I know on the floor more than someone whose job it is to say things that maintain consumer confidence.

It would be a stock hit to say "We will be replacing silicone in X period of time" if X is any longer than "right now".

Some new technologies solve those problems. Technologies in the "we hobbled something together" proof-of-concept stage, not the "I wrote this down on paper" stage.

Some of it is impressive; whether or not there will be a practical way to mass-produce it is another thing. If not, I can imagine a time in the future where only large entities that can afford 500K a chip will be using them. Or anyone at home that can afford the latest electron microscope, laser, and supercooling.

meh, I'm just glad the MHz war has pretty much subsided and we are FINALLY focusing on multi-core.

Not every chip needs speed (2, Insightful)

Enleth (947766) | more than 6 years ago | (#22893234)

I don't know any numbers, but I think I can safely guess that the computer processor business is just a fraction of the whole silicon chip manufacturing business - maybe not a small fraction, but still. And the rest of the industry doesn't need extreme speeds - there are microcontrollers, integrated buffers, logic gates, comparators, operational amplifiers and loads of other $0.05 crap you got in your toaster oven, blender, wrist watch, remote-controlled toy car, printer, Hi-Fi, etc., etc. And there is an obvious priority for those: cheap and reliable. So the silicon is not going anywhere.

Re:Not every chip needs speed (1)

imstanny (722685) | more than 6 years ago | (#22893786)

And the rest of the industry doesn't need extreme speeds - there are microcontrollers, integrated buffers, logic gates, comparators, operational amplifiers and loads of other $0.05 crap you got in your toaster oven, blender, wrist watch, remote-controlled toy car, printer, Hi-Fi, etc., etc. And there is an obvious priority for those: cheap and reliable. So the silicon is not going anywhere.
And let's not forget Solar Cells, whose production is increasing like crazy (and is causing silicon prices to increase).

Re:Not every chip needs speed (2, Insightful)

MttJocy (873799) | more than 6 years ago | (#22894002)

Exactly. If silicon is going to be phased out anywhere in 4 years (note the IF), it will be in extremely high-end supercomputer-type devices. Perhaps a decade or so later, enough research combined with economies of scale might bring the new technology to the high-end PC market; maybe another decade after that, the development dollars earned will let it enter the price range of the rest of the PC market (and note that at that point you will most likely still have a motherboard full of silicon chips, with the exception of the CPU and possibly the GPU for high-end gaming machines). Even 50 years later, your digital wristwatch will probably still be silicon; it does not need more.

Simply, silicon may begin to find competitors in the high-end market for people with deep pockets, but it will not die out in lower-end devices for decades yet, if ever (we need to come up with some very novel ideas before your wristwatch needs tens of gigahertz of processing power).

Bah! (0)

jpellino (202698) | more than 6 years ago | (#22893252)

There's no way my silicon based chips are on the way out, where do these people get this srtoqa sdj asfjvsv oiasfj jkkj&^$______-........... *no signal*

Cost over Power (0)

Anonymous Coward | more than 6 years ago | (#22893280)

Me: Hi Intel, we can either build you a chip 10x faster than the competition or 10x cheaper.
Intel: we'll take cheaper please

Birth vs. Death (3, Insightful)

192939495969798999 (58312) | more than 6 years ago | (#22893320)

This guy is confused. The BIRTH of the silicon chip is nearly over... now is when it will completely take over our environments. To put it another way: demand for silicon chips is as dead as demand for crude oil, corn, or other staples.

Wrong tag (2, Funny)

Mantaar (1139339) | more than 6 years ago | (#22893368)

This should really be tagged software, shouldn't it?

While we're at it, might add that Duke Bend'Em Forever tag, too...

Yawn (1)

MarkGriz (520778) | more than 6 years ago | (#22893422)

Wake me when they announce the death of the Slashdot dupe [slashdot.org]

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#22893518)

Wake me when they announce the death of the Slashdot dupe

That should be around the time they announce the death of the last of the /. readers who don't read the fine summary, much less the fine article linked in the summary.

The second link in the summary links to the article on graphene that you link in your witty and informative post.

ECHO! Echo! echo! (4, Insightful)

Chas (5144) | more than 6 years ago | (#22893428)

This has been getting bandied about every time someone comes up with a new, spiff-tastic technology/material to build an IC out of.

"THIS COULD REPLACE SILICON! WOOT!"

Yet it keeps NOT happening. Again, and again (and again).

The trailblazers keep forgetting, the silicon infrastructure has a LOT more money to play with than a given exotic materials research project. And, in many cases, what's being worked on in exotics can be at least partially translated back to silicon, yielding further improvements that keep silicon ahead of the curve in the price/performance ratio. Additionally, we keep getting better at manufacturing exotic forms of silicon too.

So, until silicon comes to a real deal-breaker problem that nobody can work their way around, I SERIOUSLY doubt that silicon IC is going anywhere. Especially not for a technology that has taken several years, and recockulous amounts of money simply to get a single flawless chip in a lab.

Re:ECHO! Echo! echo! (0)

Anonymous Coward | more than 6 years ago | (#22894298)

The trailblazers keep forgetting, the silicon infrastructure has a LOT more money to play with than a given exotic materials research project.

Not to mention all the money already invested in manufacturing infrastructure for silicon based devices. As long as the industry can keep utilizing that infrastructure in a profitable manner it will do so. They won't begin to move away from silicon in any major way until it becomes unprofitable for them to not do so.

Not so fast... (2, Insightful)

jandersen (462034) | more than 6 years ago | (#22893464)

The transistor was first patented in 1925 (look it up in Wikipedia) and the integrated circuit in 1949 - both fundamental for microchips - but we still use radio valves today, and not just for nostalgic reasons. Silicon will probably hang around for a long time to come, I think.

For something else to replace silicon it will have to not only be better, but so much better that it will justify the investment, or it will have to offer other, significant benefits, like being cheaper to produce, using less power or being smaller. Of these, I think speed is probably the least important, at least for common consumers.

Personally, I still haven't reached the point where my 3-year-old machine is too small or slow - not even near. It simply wouldn't make sense to upgrade. I think most people see it that way; they would probably be more interested in gadgets than in a near-supercomputer.

Re:Not so fast... (1)

ScrewMaster (602015) | more than 6 years ago | (#22894102)

I think most people see it that way, they would probably be more interested in gadgets than in a near-super computer.

Well, now that depends. If you mean a supercomputer whose only function is to run Microsoft Office faster ... you're right. Not much point in that. However, if we did have that kind of power in a sub-$1000 computer system, odds are we'll find something way cool to do with it. Something on the order of useful AI, for example.

Four years, eh? Then what? (2, Insightful)

swordgeek (112599) | more than 6 years ago | (#22893530)

Even if the hard limits of silicon circuits are reached in four years, we will NOT be switching to nanotubes, graphene, superconductors, or quantum computing. Any of those technologies are at least a decade away from commercial applications, and 15 years is more likely. If there's nowhere to advance after four more years (and I rather doubt that--we've got too much history proving us wrong), then we'll just grow out. Bigger silicon dies, bigger cache, more cores. Maybe we'll actually hit the terminus of Moore's law, but that won't stop computers from advancing, and it won't magically make any of the alternative technologies mature.

When someone makes a nanotube 80486 that I can buy and use, THEN I'll start to believe we're close to a technology shift. Hell, give me a 4004 - at least it's a product.

Bottom line: We're not there yet.

It will have to be replaced (0)

Anonymous Coward | more than 6 years ago | (#22893546)

Most of the posts above are true about silicon sticking around a while longer and being improved in other ways, but Moore's Law cannot hold true for much longer with silicon. Transistors can only get so small. Features certainly cannot get smaller than one atom, and we're already looking at features that are 5 atom layers thick.

And no matter what you do to silicon, electrons and holes are only going to move so fast through it, and you're only going to be able to fit so many transistors in a certain amount of space. Something will have to replace it. Vacuum tubes had a nice long run too, but when they hit their limits and silicon was ready, silicon took over.

Semiconductor Research Corporation, the people who make the roadmap for all this stuff and have all the technological problems already laid out, say that it'll be a decade or so, not 4 years. But there is a fundamental limit that we are going to hit soon. Charge mobility and atom size are not really adjustable when you're stuck on a certain material.
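
A rough back-of-the-envelope version of that hard limit, in Python; the 45 nm starting feature, the classic 0.7x linear shrink per generation, and the ~0.2 nm atomic spacing are all illustrative assumptions:

    feature_nm = 45.0   # assumed starting feature size
    ATOM_NM = 0.2       # rough silicon atomic spacing
    SHRINK = 0.7        # classic linear shrink per generation
    gens = 0
    while feature_nm * SHRINK >= ATOM_NM:
        feature_nm *= SHRINK
        gens += 1
    print(f"~{gens} shrinks before features reach atomic scale")

Even this crude count gives on the order of fifteen shrinks to the one-atom floor, though, as the parent notes, charge mobility and leakage bite well before then.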

Reports of my death (0)

Anonymous Coward | more than 6 years ago | (#22893634)

have been greatly exaggerated. - (signed) The Silicon Microchip

Let's see (1)

devilpainteth (700632) | more than 6 years ago | (#22893658)

Let's wait for the future... =)

Think research not computer (0)

Anonymous Coward | more than 6 years ago | (#22893738)

I think an important point raised is that the death is not a consumer death; it's a death in silicon chip research. The ventures into making silicon chips faster will stop, since that's simply not the optimal path for improvement, and instead the research will focus on some of the previously too-costly paths that are now becoming more attractive, since they don't present as many problems when downscaling size and upscaling speed.

to put things in perspective ... (1)

utnapistim (931738) | more than 6 years ago | (#22893784)

... why don't we call Nanotubes and superconductors The Microprocessor Killers (TM)?

solar panel production should benefit (1)

Danathar (267989) | more than 6 years ago | (#22893800)

Although I'm no expert, I've been reading that one reason solar photovoltaic panels have not dropped in price is that much of the silicon used to make them is tied up in chip fabrication.

I wonder if those same silicon wafer production facilities can be converted to make solar panels once the move away from silicon in the microprocessor industry takes place?

If True (1)

Nom du Keyboard (633989) | more than 6 years ago | (#22893856)

If this is true, then the players who are overly committed to silicon may lose ground to those moving to new materials and technologies. It could portend quite a shake up.

Silicon Scaling (2, Interesting)

wilsonjd (597750) | more than 6 years ago | (#22893876)

Silicon scaling will run out. We will reach a point where we can no longer make working circuits any smaller, but it will NOT be in the next four years. 45, 32, and 22 nm circuits are already in the lab. 16 nm (which may be the limit) is expected to be in production by 2018 (10 years from now). After 16 nm, quantum tunneling may be a problem. http://en.wikipedia.org/wiki/16_nanometer [wikipedia.org]

Intel thinks we may hit the limit by 2021. http://news.zdnet.com/2100-9584_22-5112061.html [zdnet.com]
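
For reference, the node list above follows the classic geometric scaling: each full node is roughly a 0.7x linear shrink, hence about 2x the transistor density. A minimal Python sketch (the clean 2x-per-node density gain is an idealization):

    nodes_nm = [45, 32, 22, 16]
    for a, b in zip(nodes_nm, nodes_nm[1:]):
        linear = b / a            # linear shrink factor
        density = (a / b) ** 2    # idealized density gain
        print(f"{a}nm -> {b}nm: {linear:.2f}x linear, ~{density:.1f}x density")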

Re:Silicon Scaling (1)

Dunbal (464142) | more than 6 years ago | (#22893962)

And THEN you just figure out how to fit more blocks of chips together... quad cores, 8x cores, 16x cores... various multi-core chips linked together on a motherboard, etc.

Need for resistance to EM Pulse? (1)

SL1200MKII (1263800) | more than 6 years ago | (#22894038)

While I agree that silicon chips are not going anywhere soon, there are applications where they are not the most suitable choice. Most of the critical infrastructure that powers advanced nations today utilizes silicon chips in one way or another. Strategically planned EM-pulse attacks on such infrastructure could cause serious, widespread disruption. I'd like to see new technology that is resistant to such attacks. I believe carbon nanotube based technology is resistant to EM pulses.

I won't believe it... (1)

5of0 (935391) | more than 6 years ago | (#22894162)

...until Netcraft confirms it. Long live silicon!

Homebrew graphene transistors (2, Interesting)

the_kanzure (1100087) | more than 6 years ago | (#22894194)

SciAm is running an April 2008 article on graphene, so here are my notes on graphene fabrication [heybryan.org]. This is pretty neat, and worth some amateur experimentation. You can make the AFM/STM for ~$100 USD [heybryan.org]. As for graphene, there are some instructions on that page for chemically synthesizing it, or just use pencil graphite and write over a piece of paper. Another cool idea is figuring out whether we can use mechanical force with a very thin pencil tip to write a circuit. JohnFlux in ##physics on freenode mentions that resistors could be used as a poor man's piezo: just heat up the metal (or perhaps pencil) and it will move. It will move very slowly. But it's a start.

The personal automobile is dead (4, Informative)

Kjella (173770) | more than 6 years ago | (#22894296)

...because the top speed has barely moved in the last decades. The commercial airplane is dead because the top speed has gone DOWN after the Concorde landed. WTF? If we really hit the hard limits of silicon, then there won't be half a dozen techs for terahertz speed waiting. It might mean that the next generation WON'T see improvements of many orders of magnitude like we have; that's it. Computers will be something that operate at some given performance, and the world will shrug at it. In short, the world won't collapse if this completely uncharacteristic development comes to an end. And even then I suspect it will go on elsewhere; did you see flashmicro's 900GB 2.5" flash disk? Yes, at ungodly prices, but I think we have a long way to go yet...

The Death Of The Death Of (1)

crevistontj (1032976) | more than 6 years ago | (#22894842)

I'm getting really sick of Slashdot spinning articles about possible future technology into "The Death Of (current technology)."

Only half of the story. (2, Insightful)

IorDMUX (870522) | more than 6 years ago | (#22895058)

Some very intelligent researchers at the Institute of Physics' Condensed Matter and Material Physics Conference came to some very intelligent conclusions about the future of CPU's... but this is hardly the end of the silicon chip.

In addition to some of the points made by other posters (Silicon CPU's will live on in smart systems, cheap systems, handheld systems, etc.), there is a whole world of silicon chips that are *not* CPU's! Analog and mixed signal circuits need highly linear devices--not just switches that turn on and off--which current silicon technology provides wonderfully. Our current analog design technology has nowhere near exhausted the possibilities on the tapestry that ten/twenty year old silicon fabrication technologies provide.

Maybe graphene, nanotubes, or the Next Big Thing will change the high performance CPU niche, but silicon still provides everything we can manage to use for the rest of the IC world.

Besides, I bet that graffiti [ieee.org] will be quite a challenge with nanotubes.

I'm putting my money on diamonds... (1)

slewfo0t (679988) | more than 6 years ago | (#22895114)

Seems to me that the new advancements in diamond manufacturing will pave the way for diamonds to be the next step.

Great articles on it...

http://www.geek.com/81ghz-diamond-semiconductor-created/ [geek.com]
http://www.wired.com/wired/archive/11.09/diamond.html [wired.com]

Slew