
Intel, IBM Announce Chip Breakthrough

kdawson posted more than 7 years ago | from the dueling-press-releases dept.


Intel announced a major breakthrough in microprocessor design Friday that will allow it to stay on the curve of Moore's Law a while longer. IBM, working with AMD, rushed out a press release announcing essentially equivalent advances. Both companies said they will be using alloys of hafnium as insulating layers, replacing the silicon dioxide that has been used for more than 40 years. The New York Times story (and coverage from the AP and others) features he-said, she-said commentary from dueling analysts. If there is a consensus, it's that Intel is 6 or more months ahead for the next generation. IBM vigorously disputes this, saying that they and AMD are simply working in a different part of the processor market — concentrating on the high-end server space, as opposed to the portable, low-power end.


112 comments


Two breakthroughs in one day? (4, Insightful)

zero-one (79216) | more than 7 years ago | (#17785062)

With this breakthrough and that other one [slashdot.org] perhaps Moore's Law needs updating.

This is a big deal (5, Interesting)

noopm (982584) | more than 7 years ago | (#17785118)

As a graduate student researching this field, this is amazing news! The Intel high-k announcement is a *major* breakthrough, and a new, disruptive technology for chip manufacturing, especially as far as the introduction of new materials into the fab is concerned (and trust me, fab engineers are paranoid about these kinds of shifts). It essentially involves replacing the SiO2 gate dielectric with a new class of materials, very likely nitrided hafnium silicates (they have not publicly acknowledged the silicate part, mentioning only a compound of hafnium, but it is the leading contender in the field).

The high-k film can be made physically thicker than the very thin SiO2 layer (which is only around 12 Angstroms at the moment, making it leak like a sieve) without compromising the capacitance requirements for the transistor. The introduction of a new metal gate in place of the classic poly-crystalline silicon gate (called poly) is also a big deal, and there is greater secrecy about what those materials are. The Wikipedia article on high-k has the details: http://en.wikipedia.org/wiki/High-k_Dielectric [wikipedia.org]
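For intuition: gate capacitance per unit area scales as k/t, so a physically thicker high-k film can match a thin SiO2 layer electrically while leaking far less. A rough sketch of the arithmetic (the k values are illustrative textbook numbers, not Intel's actual recipe):

```python
# Equivalent oxide thickness (EOT): the SiO2 thickness that would give
# the same capacitance per unit area as a physically thicker high-k film.
# C/A = k * eps0 / t, so films with equal k/t are electrically equivalent.
K_SIO2 = 3.9    # relative permittivity of SiO2
K_HIGHK = 25.0  # rough literature value for HfO2-based dielectrics

def equivalent_oxide_thickness(t_physical_angstrom: float, k: float) -> float:
    """SiO2 thickness electrically equivalent to the given high-k film."""
    return t_physical_angstrom * K_SIO2 / k

t_highk = 30.0  # Angstroms: ~2.5x thicker than today's ~12 A SiO2 layer
eot = equivalent_oxide_thickness(t_highk, K_HIGHK)
print(f"{t_highk:.0f} A of high-k acts like {eot:.2f} A of SiO2")
```

So a film 2.5x thicker than today's leaky 12 Å oxide still delivers the capacitance of a sub-5 Å SiO2 layer, while the extra physical thickness suppresses tunneling leakage.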

Re:This is a big deal (1)

obious (945774) | more than 7 years ago | (#17787584)

As a CompE undergrad, my studies revolving around SiO2 have all of a sudden become history lessons.

Re:This is a big deal (1)

WhoBeDaPlaya (984958) | more than 7 years ago | (#17787936)

So back to the roots huh? MOSFETS had that first M for a reason ;)

Re:This is a big deal (1)

NoMoreFood (783406) | more than 7 years ago | (#17791244)

Great... I work in the nuke industry. A bigger demand for hafnium is going to make our subs cost like 3 zillion dollars instead of 2 zillion :P

Re:This is a big deal (1)

rbarreira (836272) | more than 7 years ago | (#17791668)

What do you think about this post? [slashdot.org]

Re:Two breakthroughs in one day? (4, Funny)

kharchenko (303729) | more than 7 years ago | (#17785496)

Yes, Moore's Law didn't account for dupe postings. If we could just post this news a few more times today, we could jump decades ahead in terms of transistor density! Keep up the pace, dear editors :)

Re:Two breakthroughs in one day? (0)

Anonymous Coward | more than 7 years ago | (#17785758)

This is a more complete posting than the last one -- it's the only explanation, if you take into account the Slashdot "department" that published it.

Dupe reply (1)

Kuvter (882697) | more than 7 years ago | (#17788414)

Why is Moore's Law a law? It just sounds like a theory to me; it has just been surprisingly accurate to date, that's all.

Re:Dupe reply - why is moore's law a law? (1)

arbitraryaardvark (845916) | more than 7 years ago | (#17792484)

"Why is Moore's Law a law? It just sounds like a theory to me, it has just been surprisingly accurate to date, that's all."

Theories that remain surprisingly accurate over time tend to be known as laws. Unlike, say, axioms, where one counterexample could break a paradigm, a law only has to work often enough to be useful. If a prediction works 95% of the time and fails to account for 5% of the data, we can still call that a law. Feel free to call it Moore's pretty damn good conjecture. It's not intended to be rigorous, and we don't need to claim that it will work for the next 10,000 years. It's enough to understand the general point that the cost of an information processing system is cut in half every two years or so by developing technology, and that can only have profound effects on culture and the economy.

It's useful to be familiar with a couple of additional concepts: 1) Austrian economics, which shows how markets function to drive technological change, and how technological change functions to drive markets. 2) The singularity, aka "the rapture for nerds": the idea that the rate of technological change is speeding up, driving innovation in ever shorter cycles along an accelerating curve (y = x squared), so that at some point, probably in this century, the rate of change will be going basically straight up, and that on the other side of the singularity, things look weird.

So there are at least three options:
1) Moore's law is an overstatement in the long term. At some point physical limitations set in, the low-hanging fruit has already been picked, and a new plateau is reached where the cost of information systems is low compared to today, but has leveled out and is no longer decreasing.
2) Moore's law will continue to be surprisingly accurate for many years to come. The cost of information systems will keep decreasing by about half every two years, and that will continue to drive economic transformation and social change.
3) Moore's law is descriptive at the elbow of the curve, where we live now, but as change builds on change Moore's law will be found to be wildly conservative, and the cost of a given information system will drop by half in shorter and shorter cycles, until information system costs approach zero, with consequences that include AI, space travel, life extension, gene hacking, and stuff we can barely imagine now.

As formally stated in terms of the doubling of transistors on a chip, or the cost of a transistor, per period of time, Moore's law only applies to the time between the invention of the transistor and some unknown point in the future at which it no longer applies, perhaps because we use something other than transistors. It remains useful in describing the period from about 1950 (or 1970) through 2007, up until either limits are reached or pre-singularity effects kick in and shorten the doubling time. I expect measurable pre-singularity effects by 2012. Some would argue Moore's law is itself an example of noticeable pre-singularity effects.
4) Two cups of Moore, 1/2 cup of salad dressing = Moore slaw
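The scenarios above differ only in what happens to the doubling period. Under the plain statement of the law (the cost of a fixed workload halves every two years or so), the arithmetic looks like this; the numbers are illustrative, not forecasts:

```python
# Cost of a fixed information-processing workload under Moore's-law
# scaling: it halves every `doubling_years`. Purely illustrative.
def cost_after(years: float, start_cost: float = 1.0,
               doubling_years: float = 2.0) -> float:
    """Relative cost after `years` of halving every doubling period."""
    return start_cost * 0.5 ** (years / doubling_years)

# The classic 2-year pace: ~1000x cheaper after 20 years.
print(cost_after(20))
# Shortening cycles (the pre-singularity scenario): the same drop, sooner.
print(cost_after(20, doubling_years=1.0))
```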

Re:Two breakthroughs in one day? (1)

ozbird (127571) | more than 7 years ago | (#17786742)

... perhaps Moore's Law needs updating.

Moore's "Law" isn't - it's more a rule-of-thumb.

Not news (3, Insightful)

LighterShadeOfBlack (1011407) | more than 7 years ago | (#17785072)

Sorry, but why is this being reported again now? We already knew Intel and IBM had achieved a 45nm process and that it would be coming to mass-market chips in 2007-08. It's 2007 and it's here. Hooray and all that, but is a company following through on its claims really so shocking that it warrants being reported again... twice [slashdot.org]?

Re:Not news (2, Insightful)

unc0nn3ct3d (952682) | more than 7 years ago | (#17785152)

Pretty sure this article was more about the switch to hafnium as an insulator, as opposed to the 45nm technology. Also the fact that they are using a new silicon substrate over the existing standard...

Re:Not news (1)

LighterShadeOfBlack (1011407) | more than 7 years ago | (#17785192)

The hafnium and high-k metal gates are pre-requisites for the 45nm process. The two articles highlighted might vary somewhat in focus but they're definitely reporting the same thing.

Re:Not news (2, Insightful)

Bender_ (179208) | more than 7 years ago | (#17785396)


That is not true. There will be a number of companies doing 45nm without high-k and metal gates.

Re:Not news (1)

LighterShadeOfBlack (1011407) | more than 7 years ago | (#17786098)

Well, whether they use that particular method or not, the point is that the existing materials Intel is using for the 65nm process apparently aren't up to the task at the 45nm scale. If that's wrong, well, I guess I've been misled by the articles I've read on the subject.

Re:Not news (2, Informative)

Bender_ (179208) | more than 7 years ago | (#17786426)


The alternative would have been just to shrink the devices, gain less in performance, and use circuit techniques to battle parasitic power consumption. That is what most companies in cost-sensitive markets are going to do.

Re:Not news (1)

unc0nn3ct3d (952682) | more than 7 years ago | (#17785196)

Hehe, and of course if I had read the other article I would have seen that it, too, was about hafnium and the new substrate. So I retract my above statement; your comment about this not being news is totally correct.

Re:Not news (1)

Workaphobia (931620) | more than 7 years ago | (#17787626)

> "is a company following through on its claims really so shocking"

Yes. Yes it is.

Chip Breakthrough.... (3, Funny)

Prysorra (1040518) | more than 7 years ago | (#17785088)

But can they keep up with Lays? :D

Please tag article... (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17785092)

Would people please tag article "frontfuckingpagedupe". Thank you.

Re:Please tag article... (2, Informative)

dreddnott (555950) | more than 7 years ago | (#17785314)

This article's summary is far more accurate and informative than the other one. I posted several times in the older post to help clear up some misinformation (the article it linked to misspelled hafnium as "halfnium" and only mentioned it once, and never mentioned IBM or AMD).

RFI? Electromigration? (2, Insightful)

caitriona81 (1032126) | more than 7 years ago | (#17785098)

But how much further will that get them before RFI makes it a moot point? At that small of a pathway, I'd think that random radio signals and electrical noise would be disastrous.
Also, how well does this survive long term? Is it resistant to electromigration [wikipedia.org] over time?
All great to hear, but I'm not sure how long this will let them keep pace with Moore's law, at best it buys a couple more years of progress on current processor designs I guess.

Re:RFI? Electromigration? (1)

pilgrim23 (716938) | more than 7 years ago | (#17785232)

Good point. That is the very reason NASA sticks to 386 and earlier vintage computers, from what I have read. Outside of the insulating atmosphere, cosmic rays pass through and have a tendency to be larger than the circuit gap. This makes for some interesting and adverse additions to any computation.
      Every now and then the popular press reports new advances in biological computers, light-based ones, heck, I even read of a wooden one once... Nothing ever seems to come of it, though, except interesting grad student papers. Electronic computers are the workhorse for now, and for the future it seems we will just see... Moore of the same ;)
 

Re:RFI? Electromigration? (1)

644bd346996 (1012333) | more than 7 years ago | (#17785712)

NASA sends up several ThinkPads with every shuttle launch. They have been doing so since 1993, and they have upgraded many times to more modern machines. STS-114 was the first to fly several A31p ThinkPads with 1.8 GHz P4s.

For the Mars missions and the like, radiation-hardened processors like the RAD750 are used. It seems that everything in use is at least Pentium class.

Re:RFI? Electromigration? (1)

GigsVT (208848) | more than 7 years ago | (#17785864)

The shuttle's internal systems run on obsolete crap. That's why they send up laptops.

Re:RFI? Electromigration? (3, Informative)

pnewhook (788591) | more than 7 years ago | (#17786104)

No, the shuttle and station run on older stuff because those processors are radiation immune, and they are critical systems that cannot crash. The laptops are for everyday work and do not interface with the shuttle's systems. If one crashes from the radiation, the astronauts simply put it aside and grab another one.

Re:RFI? Electromigration? (1)

GigsVT (208848) | more than 7 years ago | (#17788304)

The shuttle used hand-woven magnetic core memory until 1990.

It's obsolete crap, even after the 1990 upgrade. It was designed in the 60s, and the only reason it wasn't decommissioned 3 decades ago was political: no one wanted to admit they dumped billions of dollars down the toilet.

Re:RFI? Electromigration? (1)

pnewhook (788591) | more than 7 years ago | (#17789904)

The shuttle used hand-woven magnetic core memory until 1990. It's obsolete crap, even after the 1990 upgrade. It was designed in the 60s, and the only reason it wasn't decommissioned 3 decades ago was political: no one wanted to admit they dumped billions of dollars down the toilet.

You should really research things before posting an opinion. It would really reduce the amount of bullshit you write.

You can't really fly anything beyond a Pentium-class processor because you get radiation upsets. Even then, you have to disable the cache. The shuttle uses proven technology, which is by necessity older, because the computers have to be fault tolerant.

Re:RFI? Electromigration? (2, Insightful)

mrhartwig (61215) | more than 7 years ago | (#17789990)

The shuttle used hand-woven magnetic core memory until 1990.

Yep. Stable, information-retaining (unfortunately, it even retains info after immersion in seawater), and basically immune to cosmic ray disruptions, which means it doesn't require a lot of error-correction circuitry. Not terribly data-dense or fast compared to semiconductor memory (part of the reason to replace it, after all), but it works.

It was designed in the 60s...

Actually, the computers themselves were designed in the 70s, with updates in the 80s; core memory (I don't think you meant that) was actually from the 40s and 50s, with significant updates afterwards. You know, of course, that it took years of system integration testing after the new HW was finished before the new semiconductor memory (along with the upgraded CPUs, etc.) was flown? Some silly idea NASA has about trying to make sure stuff that keeps people alive isn't broken in any way.

...the only reason it wasn't decommissioned 3 decades ago....

Right. If it flew in '90 (it might actually have been 1991, IIRC, but maybe not) it's still only been flying for 17 years. How do you decommission something 13 years before it first flew?

Just because something's old doesn't mean it's not useful. There are also cost/benefit factors in replacement; in this case (probably; I don't pretend to know all of the reasons) external requirements that have nothing to do with HW (like testing regulations) greatly increase the cost of replacement. Plus, you have the whole anytime-you-change-you-increase-risk problem; there's a reason that "if it ain't broke, don't fix it" is an adage.

Re:RFI? Electromigration? (3, Insightful)

stevesliva (648202) | more than 7 years ago | (#17786300)

The shuttle internal systems run on obsolete crap.
Obsolete, incredibly reliable, utterly adequate, rock-solid gold. If it ain't broke, don't fix it. Launching enormous rockets under software control is possible to screw up [wikipedia.org]. Given the choice, I'd rather fly with the proven computers.

Re:RFI? Electromigration? (1)

GigsVT (208848) | more than 7 years ago | (#17788316)

That's why they replaced some of the most obsolete parts of it (like the hand-woven magnetic core memory) in 1990, right?

It's crap. Everyone knows it's crap. It would have been shelved a very long time ago if it weren't for politics.

Do you refuse to use any bank that doesn't use a Univac? You realize the Univac came out 10 years before the shuttle computer was designed, right? That's how obsolete the shuttle computer is.

Re:RFI? Electromigration? (1)

SmittyTheBold (14066) | more than 7 years ago | (#17789436)

You realize Univac came out 10 years before the shuttle computer was designed, right? That's how obselete the shuttle computer is.
In actuality, it's 10 years more obsolete than the Shuttle's computers, and as we all should know, ten years is an eternity in the computer world.

Yes, if you must know, I thought your comment deserved another with an equally absurd thesis.

Re:RFI? Electromigration? (1)

mrhartwig (61215) | more than 7 years ago | (#17790070)

Oh, yeah -- I can be more absurd than you. Since all computers are based on transistors, we should scrap them all. They are, after all, based on technology designed in the 1940s and absolutely must be obsolete.


Nyah, nyah.

interesting footnote
I learned something while looking in Wikipedia to find out when the "Univac" (there was more than one, of course) was released, so I could compare it to the IBM System/360 (from which design, eventually, came the Shuttle CPUs). UNIVAC I, from 1951, used tanks of liquid mercury for memory. UNIVAC II (1958) had core memory. Wow!

Re:RFI? Electromigration? (1)

TheGavster (774657) | more than 7 years ago | (#17791116)

The difference between banking on a Univac and flying a spaceship with a radiation-hardened 386 is that improvements on the 386 aren't necessarily reflected in spaceship-flying performance, whereas improvements on the Univac show distinct benefits in banking. The laws of physics work just like they did the first time we put a shuttle in space; on the other hand, transaction volume and reporting complexity have increased tremendously since the first mechanical accounting machines. When we build a new shuttle with more complicated flight control requirements, we'll need new computers to fly it. Until then, there's really no sense screwing with the known-good system.

To head off the anticipated "but the shuttle is crap" argument, I'm not going to contradict that. As a reusable spacecraft, the shuttle is woefully obsolete and inappropriate for the task. We should have started designing and building a new one years ago. The limited capabilities of the shuttle have in turn led to some questionable design decisions on the ISS and limited development of our capability to do something other than go to LEO and come back. I'm just saying that if you're going to fly a space shuttle, the stock flight computer is all you need.

Re:RFI? Electromigration? (0)

Anonymous Coward | more than 7 years ago | (#17786294)

Outside of the insulating atmosphere, cosmic rays pass through and have a tendency to be larger than the circuit gap.

No, cosmic rays go through the whole planet -- a little atmosphere won't stop 'em. It's radiation in general, and while they run radiation-hardened versions of those chips, it's harder to harden processors that are denser or just more complex.

That and they just don't need more horsepower -- for the extremely specific purposes they're used for, they may as well be ASICs. They do take fairly current (and ruggedized) laptops up with them for general-purpose use.

Re:RFI? Electromigration? (1)

myurr (468709) | more than 7 years ago | (#17785238)

But that may be all it takes. It's not like they're going to suddenly pack up their bags and stop researching this stuff. Each advance only ever buys time before the next advance is required.

Re:RFI? Electromigration? (1)

ssista537 (991500) | more than 7 years ago | (#17785286)

One reason why the industry moved to Cu interconnects from Al is lower electromigration. Due to the specific microstructure of Cu and its low bulk diffusion, it is much more resistant to electromigration. Having said that, as the nodes keep shrinking this might become a problem in the future... it may not be an issue at the 45 nm node, though.

Re:RFI? Electromigration? (3, Interesting)

Kohath (38547) | more than 7 years ago | (#17785328)

There are many, many people spending their careers solving those types of problems.

It's not really interesting when someone does something in 45nm. It's interesting when enough of the problems with 45nm are solved for it to actually be practical to make 45nm-based chips.

So, the answer to your question is: someone figured it out already.

Electromigration is only an issue at high current densities. For clarification, "high" is defined as the density where electromigration becomes an issue. The solution is to use less current, use more metal so the current is less dense, or find a material that can handle higher current density.
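The guardrail described above reduces to checking J = I/A on every wire. A toy version of such a check (the limit and geometry are made-up numbers, not a real process rule):

```python
# Toy electromigration check: current density J = I / A must stay under
# a process-defined limit. The limit here is hypothetical.
J_LIMIT = 1e10  # A/m^2, illustrative allowed current density

def current_density(current_a: float, width_m: float, thickness_m: float) -> float:
    """Current density in a rectangular wire cross-section."""
    return current_a / (width_m * thickness_m)

def em_ok(current_a: float, width_m: float, thickness_m: float,
          j_limit: float = J_LIMIT) -> bool:
    """True if the wire stays under the electromigration limit."""
    return current_density(current_a, width_m, thickness_m) <= j_limit

# 1 mA through a 100 nm x 50 nm wire blows well past this limit, so the
# fix is exactly the menu above: less current, a wider wire, or a tougher metal.
j = current_density(1e-3, 100e-9, 50e-9)
print(f"J = {j:.1e} A/m^2, within limit: {em_ok(1e-3, 100e-9, 50e-9)}")
```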

Re:RFI? Electromigration? (1)

caitriona81 (1032126) | more than 7 years ago | (#17785370)

Electromigration takes some time to show up, though. If they are just announcing this process now, what problems are going to show up 3, 4, 5 years down the road?

Re:RFI? Electromigration? (1)

Kohath (38547) | more than 7 years ago | (#17785402)

It depends on whether the engineers do their jobs and check their current density. If they do, no problem. If not, some percentage of the chips will eventually fail. Worrying about it doesn't help. Engineers checking it is the only thing that helps.

Re:RFI? Electromigration? (1)

skoaldipper (752281) | more than 7 years ago | (#17785588)

It's not really interesting when someone does something in 45nm. It's interesting when enough of the problems with 45nm are solved for it to actually be practical to make 45nm-based chips.
Well said. Even the article describes it as (I believe) "evolutionary, not revolutionary". The standard transistor design has not changed to my knowledge, only one (or two) of the substrate materials from which it's made. However, I think the molecular (DNA) logic gate model would qualify as (at least) "revolutionary" (if not "evolutionary" as well). And to your point, I believe even a molecular computer has already been demonstrated, but in realistic terms it is far from practical.

Re:RFI? Electromigration? (4, Funny)

cheezedawg (413482) | more than 7 years ago | (#17785604)

Golly- I hope that all of the PhDs working on Intel's 45nm process are reading /. today. I bet they never thought about that.

Re:RFI? Electromigration? (1)

elteck (874753) | more than 7 years ago | (#17785902)

RFI: Actually, the smaller the circuit, the better its RF immunity, because the smaller the wiring, the lower its antenna efficiency.
But I can assure you, since we crossed the 100 MHz barrier a lot has been done to improve RF immunity. Today's system boards and chips are RF designs, partly to keep reflections small and maintain signal integrity. All traces are transmission lines, which have good RF immunity as well.

Electromigration: This is the reason why switching currents (also known as shoot-through current / overlap current) have to be reduced along with feature size. Wiring already occupies a lot of die area.

Re:RFI? Electromigration? (1)

Manchot (847225) | more than 7 years ago | (#17785962)

Testing for electromigration issues is standard operating procedure for companies like Intel. They basically pump insanely high current densities through their devices and see how long they take to fail. Then they can use that figure to extrapolate how long they'll take to fail under normal conditions. Basically, they can test years' worth of damage in days. Asking whether Intel checks for this is like asking whether car companies check to see if the engines start up before selling them. Of course they do.
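The extrapolation step is commonly modeled with Black's equation, MTTF = A * J^-n * exp(Ea/(kT)): stress at higher current density and temperature, then scale back to use conditions. A sketch with illustrative parameter values (n, Ea, and the conditions are placeholders, not Intel's numbers):

```python
import math

# Accelerated-life extrapolation via Black's equation.
# The ratio of MTTF at two conditions is the acceleration factor.
K_BOLTZ_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(j_stress: float, j_use: float,
                        t_stress_k: float, t_use_k: float,
                        n: float = 2.0, ea_ev: float = 0.9) -> float:
    """MTTF(use) / MTTF(stress): how much longer parts live at use conditions."""
    current_term = (j_stress / j_use) ** n
    thermal_term = math.exp(ea_ev / K_BOLTZ_EV * (1 / t_use_k - 1 / t_stress_k))
    return current_term * thermal_term

# Stress at 2x the use current density and 400 K, use condition 350 K:
af = acceleration_factor(j_stress=2.0, j_use=1.0,
                         t_stress_k=400.0, t_use_k=350.0)
print(f"3 days of stress testing covers about {3 * af / 365:.1f} years of use")
```

With these placeholder parameters, a few days of stress corresponds to over a year of field life, which is the "years of damage in days" effect the parent describes.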

Is this kdawson's first front page dupe (3, Funny)

antifoidulus (807088) | more than 7 years ago | (#17785100)

Welcome to the club! On your application as editor, did you have to swear that you don't actually read slashdot as a precondition for employment like all the other editors?

Re:Is this kdawson's first front page dupe (1)

dreddnott (555950) | more than 7 years ago | (#17785374)

Hey now, you should be positively thanking him. The previous posting had an awful summary that didn't mention IBM, AMD, or the fact that the new High-K replacement was based on hafnium (they misspelled it as halfnium in the actual article, which was even worse).

At least with this summary we'll get cool arguments about Intel vs. AMD and IBM and conspiracy theories and stuff.

Re:Is this kdawson's first front page dupe (2, Funny)

QuickFox (311231) | more than 7 years ago | (#17785888)

they misspelled it as halfnium
That's no misspelling, it is halfnium! You could have understood this yourself, if you hadn't been so quick to dole out criticism, and instead had spent a second considering the fact that they reduced the size from 90 nm to 45 nm.

Re:Is this kdawson's first front page dupe (1)

gbjbaanb (229885) | more than 7 years ago | (#17786276)

Apart from the dupe, kdawson is possibly the best editor they have. I, for one, blame our new firehose overlords -- so it's our fault for voting for 2 of the many posts about this news.

printer/ad free version (2, Informative)

farker haiku (883529) | more than 7 years ago | (#17785106)

here [nytimes.com]

Axiom? (4, Insightful)

rumith (983060) | more than 7 years ago | (#17785112)

The Intel announcement is new evidence that the chip maker is maintaining the pace of Moore's Law, the technology axiom

I thought it was an empirical law; the definition of axiom is quite different from that.

Intel said it had already manufactured prototype microprocessor chips in the new 45-nanometer process that run on three major operating systems: Windows, Mac OS X and Linux.

Again, I thought it was the operating systems that run on microprocessors, not vice versa. And I (not being a kernel developer, though) can't see any reason for an OS to stop functioning on a new processor model if the architecture is intact and no serious hardware-level bugs are introduced.

Re:Axiom? (1)

forkazoo (138186) | more than 7 years ago | (#17785520)

Intel said it had already manufactured prototype microprocessor chips in the new 45-nanometer process that run on three major operating systems: Windows, Mac OS X and Linux.

Again, I thought it's the operating systems who run on microprocessors, not vice-versa. And I [not being a kernel developer, though] can't see any reason for an OS to stop functioning on a new processor model if the architecture is intact and no serious hardware-level bugs are introduced.
Well, yeah. That's pretty much the point. Usually, a first go at a new processor has serious hardware bugs and doesn't run very well. Running existing OSes on real silicon is a very important step in the creation of a new processor. If things are going smoothly with the design, and they have it running real code more quickly than they figured, then it speaks well to the possibility of them being able to ramp up production easily.

Re:Axiom? (1)

grimJester (890090) | more than 7 years ago | (#17785670)

The OSes running on the prototypes are probably meant to show that there are functioning processors made using the new process, as opposed to a couple of transistors in a lab. A kind of proof that this isn't just vaporware to boost stock prices.

Re:Axiom? (1)

igrokme (254668) | more than 7 years ago | (#17787346)

To Moore and logicians, it's an empirical law. To many business plans (Intel's and AMD's not least), and arguably to the technology sector as a whole, it has been made axiomatic.

Moore himself has argued against this usage, but he does not control what assumptions people stake their business plans on, even when they are based on his empirical laws.

Re:Axiom? (0)

Anonymous Coward | more than 7 years ago | (#17787442)

That's the point of announcing that! It shows that the process is stable enough to have a processor team go through all the motions necessary for designing and verifying the processor -- including all of the simulations necessary for validation of high-speed circuits. All of it implies the process is stable and correlates with the models used for simulation -- not just that they laid it out, taped it out, manufactured 1000 of them, and 1 happened to work by sheer probability. BZZZT! Just the opposite... the process is ready to rock and roll... believe it.

AH HA! (1)

frankwolftown (968794) | more than 7 years ago | (#17785212)

Take that Gordon Moore!
And in your face space coyote!

Rename? (5, Funny)

somegeekynick (1011759) | more than 7 years ago | (#17785264)

What, now Silicon Valley becomes Hafnium Valley?

Re:Rename? (1, Funny)

Anonymous Coward | more than 7 years ago | (#17785398)

Sure. And in another 30 years or so, it'll be Quaternium Valley. Oh wait [wikipedia.org], that can't be right...

Re:Rename? (1)

Original Replica (908688) | more than 7 years ago | (#17786432)

What, now Silicon Valley becomes Hafnium Valley?

I know silicon is a pretty common element, how difficult is it to find hafnium? If it is rare, could this lead to super expensive chips?

Re:Rename? (1)

Grishnakh (216268) | more than 7 years ago | (#17786860)

Probably not. From my understanding of this new tech, silicon will remain the substrate of the chips. Hafnium is only used as an insulating material layered on the top. So the quantities of hafnium will be extremely small in relation to the amount of silicon. Along with a smaller (45nm) process, the total amount of hafnium in a single chip should be quite small.

Re:Rename? (2, Informative)

Mspangler (770054) | more than 7 years ago | (#17790800)

From webelements:

"Most zirconium minerals contain 1 to 3% hafnium. Hafnium is a ductile metal with a brilliant silver lustre. Its properties are influenced considerably by the impurities of zirconium present. Of all the elements, zirconium and hafnium are two of the most difficult to separate. Hafnium is a Group 4 transition element.

Because hafnium has a good absorption cross section for thermal neutrons (almost 600 times that of zirconium), has excellent mechanical properties, and is extremely corrosion resistant, it is used for nuclear reactor control rods.

Hafnium carbide is the most refractory binary composition known, and the nitride is the most refractory metal nitride (m.p. 3310C)."

Intel is not going to be burned by thermal problems again, and you can also hide behind your CPU if "the big one" goes off in the neighborhood. OK, several CPUs and a water tank. But still.

Most efficient.

Last price I could find is $150/pound.

Re:Rename? (2, Funny)

autophile (640621) | more than 7 years ago | (#17787250)

What, now Silicon Valley becomes Hafnium Valley?

Let's hope that real estate prices get cut in haf :(

--Rob

Love the picture (0)

Anonymous Coward | more than 7 years ago | (#17785296)

That's got to be the biggest 45 nanometer wafer I've ever seen! And probably the most expensive dud, since that guy ought to know that there's no point in dressing up in a bunny suit if you aren't wearing a hood.

Whaa? (3, Insightful)

Godji (957148) | more than 7 years ago | (#17785388)

If there is a consensus, it's that Intel is 6 or more months ahead for the next generation. IBM vigorously disputes this, saying that they and AMD are simply working in a different part of the processor market

Didn't read TFA, but is it possible to have a consensus with one party vigorously disputing it?

Re:Whaa? (0, Funny)

Anonymous Coward | more than 7 years ago | (#17785490)

It is the consensus of my friends that I am a jerk. I vigorously dispute this.

'course! (1)

chris_eineke (634570) | more than 7 years ago | (#17785564)

They have a consensus about disputing each other. :)

Re:Whaa? (2, Insightful)

UltraAyla (828879) | more than 7 years ago | (#17785938)

It would seem to be the consensus of the analysts, but who knows how accurate that is when one of the companies disputes the information the consensus is based on.

Re:Whaa? (0)

Anonymous Coward | more than 7 years ago | (#17786076)

Umm, yes? If the consensus is among analysts, IBM doesn't have to agree and can claim any old thing (and maybe they're right).

Re:Whaa? (0)

Anonymous Coward | more than 7 years ago | (#17789582)

You don't have a brother, do you?

How long for this to reach laptops? (0)

Anonymous Coward | more than 7 years ago | (#17785394)

I am thinking of buying a new laptop soon. I have a few questions:

1. Is this technology applicable to laptops?
2. If so, how long will it take for it to be integrated into laptops?
3. Will it make them more or less expensive?
4. Will it be a huge jump in performance, or a smaller one?

And most of all, would it be OK to go ahead and get a laptop now, or would it be better, for either cost or performance reasons, to wait until this technology makes it into laptops?

Re:How long for this to reach laptops? (3, Insightful)

DrSkwid (118965) | more than 7 years ago | (#17785480)

never buy anything

Re:How long for this to reach laptops? (1)

dimeglio (456244) | more than 7 years ago | (#17785570)

Personally, I never base my purchasing decisions on announcements but rather on my current needs. If I need a laptop, I'll buy whatever's available now that meets my needs, and try to get it at the best possible price. Otherwise, I'd never buy anything, because by the time new technology is available commercially, something better has been announced.

Re:How long for this to reach laptops? (1)

Jesus_666 (702802) | more than 7 years ago | (#17788912)

True. I am delaying a laptop purchase right now, but that's just because I don't want to buy an Apple laptop immediately before a new version of OS X comes out again. I had enough fun using 10.3 at a Java 1.5-centric university to be willing to wait a couple of months for the next version. But that's arguably an Apple-user-who-doesn't-want-to-buy-retail-OS X-specific problem.

Generally speaking, if you don't plan on relying on a proprietary system there's not much to hold out for, unless $HARDWARE with $FEATURE comes out $SOON and you really want/need $FEATURE.

Re:How long for this to reach laptops? (1)

2nd Post! (213333) | more than 7 years ago | (#17786026)

1) Yes
2) Next year
3) More expensive
4) How much are you willing to spend?

hafnium (0)

Anonymous Coward | more than 7 years ago | (#17785412)

is it because hafnium is haf-thick? :-)

Hafnium? That's weak ... (0)

Anonymous Coward | more than 7 years ago | (#17785428)

Wait until I produce my super-duper fulium-insulator chips!

Re:Hafnium? That's weak ... (1)

ChrisMaple (607946) | more than 7 years ago | (#17786032)

Don't you mean Holmium, which is actually an element like Hafnium?

Moore's Law is Dead! Or not! (3, Funny)

mschuyler (197441) | more than 7 years ago | (#17785552)

The funny thing about this is that every few weeks you read some article that says, "Yup! That's it! We simply cannot get any more out of Moore's Law! It's dead."

Then a couple weeks later someone says, "Yup! We're gonna squeeze a few more years out of Moore's law. New advance! It isn't dead!"

Moore's Law is like the Energizer Bunny. It just keep's going.

Re:Moore's Law is Dead! Or not! (1)

dvice_null (981029) | more than 7 years ago | (#17785788)

I think they (Intel) estimated at some point that the Moore's law would work at least to 2015. At that point they would need to start working with something smaller than atoms to keep it up.

But of course the processor development can still continue after that. We could for example stack many layers on each other to get a 3d chip.

Or who knows if we learn how to manipulate the particles of the atoms (or something similar) and create a chip using those.

Re:Moore's Law is Dead! Or not! (0)

Anonymous Coward | more than 7 years ago | (#17785928)

Depends on the definition. If you take the official definition, "the number of transistors on an integrated circuit for minimum component cost doubles every 24 months", then Moore's Law is still valid.
To consumers and even many geeks, it is more common to think of Moore's Law as saying that for the same price, in about 18 months you can get a computer twice as fast. To keep this economic expectation sort of alive, the industry has gotten very desperate. They went multi-core and changed the specs to take that into account. Very few kernels and programs take advantage of multi-core, so for most people a 3-year-old computer will perform almost as well as a new one.
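The gap between those two readings is just exponent arithmetic. A small illustrative sketch (the doubling periods are the ones quoted above, not measured data):

```python
# Moore's-law arithmetic under two common readings:
# the "official" 24-month transistor-count doubling vs. the
# popular 18-month "twice as fast" expectation.

def doublings(years, period_months):
    """How many doubling periods fit in the given span of years."""
    return years * 12 / period_months

def growth_factor(years, period_months):
    """Total growth after that many doublings."""
    return 2 ** doublings(years, period_months)

# Over the 3 years of the poster's "3-year-old computer":
print(growth_factor(3, 24))  # transistor count: 2^1.5, about 2.8x
print(growth_factor(3, 18))  # naive speed expectation: 2^2 = 4x
```

So even taken at face value, the two definitions diverge noticeably within a single upgrade cycle, which is part of why the consumer version of the "law" feels broken when single-thread performance stalls.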

Re:Moore's Law is Dead! Or not! (1)

mschuyler (197441) | more than 7 years ago | (#17786414)

If you change the definition to something Moore never said, then of course it might not fit.

Moore's original definition had to do with the number of transistors on "an integrated circuit." The original graph didn't even specify size. (Go to Intel.com and search for Moore's Law; all that stuff is there.) That engineers have been unable to keep up with exploiting the law isn't really all that surprising or uncommon. However, the interesting thing about Moore's Law is that if you extrapolate the graph backwards in time, it still fits. Take the original switches in the Hollerith card reader used to tabulate the 1890 census, through all the iterations from electrical circuits (like in the old telco switching rooms) to the vacuum tubes of the Eniac, to transistors of the size used in two-transistor radios, to core memory, to the 45 nm chips of today or tomorrow, and the graph is boggling. The Law has been functioning since the 1890s. And there is no real reason to think it can't keep going, if you don't fixate on silicon. Every time Moore's Law has met an impasse in the past, e.g. vacuum tubes, the technology has changed medium, first to transistors, then to integrated circuits. Moore has said he thinks the law will hold to 2015, but that's on silicon. The next medium will carry it further, whether that's some sort of 3D or atomic, or even biological memory. Kurzweil.com has a lot of good stuff on what's about to happen.

So the fact that a word processor works as well on a three-year-old computer is not really the point of the law. What the law will allow is the next generation of word processor: one you can talk to instead of type in, one that will quickly and accurately translate what you say into radically different languages, one that fits in your eyeglasses and projects a screen in front of you, or one that is simply implanted. To ask "when will this be ready for my laptop, and will it lower the price?" (as someone does elsewhere on this thread) is not the right question.

You're not going to need a laptop any more.

Re:Moore's Law is Dead! Or not! (3, Funny)

noidentity (188756) | more than 7 years ago | (#17787220)

"Moore's Law is like the Energizer Bunny. It just keep's going."

Moore's Law is like the inappropriate apostrophe. It just won't die.

Diamonds are next.... (1)

HerculesMO (693085) | more than 7 years ago | (#17785630)

Silicon is inferior to industrial diamonds in so many ways, I'm wondering when they will start being used in processor design. I read about it years ago, so perhaps this is the first step in that direction.

Re:Diamonds are next.... (1)

TeknoHog (164938) | more than 7 years ago | (#17785830)

In other words, diamonds are the geek's best friend!

Re:Diamonds are next.... (1)

Bender_ (179208) | more than 7 years ago | (#17785842)


Actually, I believe there are only two properties of diamond that are superior to silicon with respect to electronic applications: heat conductivity and band gap.

The disadvantages are numerous, starting with the very basic fact that there is no known n-type dopant for diamond.

Re:Diamonds are next.... (2, Interesting)

ChrisMaple (607946) | more than 7 years ago | (#17786064)

Re:Diamonds are next.... (1)

Bender_ (179208) | more than 7 years ago | (#17786408)


Nice, I was not aware of the later work. It is still a long way to proper junctions.

Re:Diamonds are next.... (0)

Anonymous Coward | more than 7 years ago | (#17786344)

I guess they have to create a decent-sized diamond boule to cut wafers from (and act as a seed for growing more), first. Not sure if they've done that yet.

Re:Diamonds are next.... (1, Funny)

Anonymous Coward | more than 7 years ago | (#17787434)

With all the "quality time" I spend with my computer, I'm surprised I haven't given it a diamond already.

Excellent! (0)

Anonymous Coward | more than 7 years ago | (#17785752)

I was thinking recently that I should profile and optimize some of the software I maintain, but it sure looks like I won't need to.

Silicon was here (1)

Kalle Barfot (147248) | more than 7 years ago | (#17785964)

Welcome to Hafnium Valley

Re:Silicon was here (1)

Kalle Barfot (147248) | more than 7 years ago | (#17786002)

Hafnium comes from the Latin Hafnia for "Copenhagen", home town of Niels Bohr.

So, welcome to Hafnia Vallis!

If you're into investing ... (1)

ScrewMaster (602015) | more than 7 years ago | (#17786382)

this might be a good time to put some money into your local Hafnium mine.

In Silico? (1)

Kensai7 (1005287) | more than 7 years ago | (#17786434)

Hmm, I don't know if you have noticed, but the old expression in silico will now have to be dropped...

In ferro perhaps!

Re:In Silico? (0)

Anonymous Coward | more than 7 years ago | (#17788532)

No.

All the chips have silicon substrates still. This is just the transistor gates.

It's Da Bomb! (1)

Easy2RememberNick (179395) | more than 7 years ago | (#17787258)

Keep it away from stray neutrons! (someone had to say it)

Finally... (2, Interesting)

IorDMUX (870522) | more than 7 years ago | (#17787848)

Well, it's about time. Hafnium oxide dielectrics were the talk of the semiconductor research world in the early/mid 90's. Big-time chip manufacturers refused to adopt the technology, though, hoping that some technology that didn't require the re-vamping of an entire fabrication facility would come along and magically reduce gate oxide leakage current.

The technology is fairly mature by now (from a research standpoint), so the only "news" is that the major manufacturers have finally realized that it is the least of all evils from a commercial point of view.

Personally, I wonder how different the current market would be if one of the commercial fab plants had embraced the technology 5-10 years ago.

Cool - but internet is still waaaaaaaaay slow (0)

Anonymous Coward | more than 7 years ago | (#17788186)

what's the point to run at gazillions GHZ speed if the internet is stillll
sloooooooooooooooooooooowww ?

Need Compelling Applications for these chips (1)

CalcuttaWala (765227) | more than 7 years ago | (#17789374)

While advances in chip technology are indeed good news, they need to be backed up by equivalent advances in new-age applications. After all, who would want more firepower behind the same old MS Office or chat client?

My take is that the immense number-crunching power of these new-age chips should be directed towards a new generation of data compression/decompression applications based on newer algorithms. This would allow intense video/graphics-based applications like Metaverse/Second Life to run elegantly and effortlessly on existing networks.

Other applications could be on-the-fly language translation, where you speak English at one end of a telephone line and the listener hears French at the other end.

We need creative applications like these to leverage the computing power of these new chips.

Re:Need Compelling Applications for these chips (0)

Anonymous Coward | more than 7 years ago | (#17790268)

Look up Ray Kurzweil. One of his startups is making such a device/program. He has a video of it working in real time, but it's mostly for voice recognition.