Nvidia Firmly Denies Plans To Build a CPU

timothy posted more than 6 years ago | from the this-time-we-mean-it dept.

Graphics 123

Barence writes "A senior vice president of Nvidia has denied rumours that the company is planning an entry into the x86 CPU market. Speaking to PC Pro, Chris Malachowsky, a co-founder and senior vice president, was unequivocal. 'That's not our business,' he insisted. 'It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.' He also pointed out that such a move would expose the company to fierce competition. 'Are we likely to build a CPU and take out Intel?' he asked. 'I don't think so, given their thirty-year head start and billions and billions of dollars invested in it. I think staying focused is our best strategy.' He was also dismissive of the threat from Intel's Larrabee architecture, following Nvidia's chief architect calling it a 'GPU from 2006' at the weekend."

first (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24765319)

penis

firsty first (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#24765321)

Why x86? Why not x64? Talk about a chip from 1986!

Inaccurate headline (5, Informative)

TheRaven64 (641858) | more than 6 years ago | (#24765331)

nVidia are building a CPU, a Cortex A9 derivative with a GPU on-die and a load of other nice features. The summary states that they're not building an x86 CPU, but this is not what the headline says.

x86 rumors origin ? (3, Interesting)

DrYak (748999) | more than 6 years ago | (#24765617)

Currently nVidia is partnering with VIA for small form factor x86 boxes, and they have made several presentations about a combination of (VIA's) x86-64 Isaiah and (their own) embedded GeForce, touting that the platform would be the first small form factor able to run Vista in all its DX10 and full Aero glory.

Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?

Re:x86 rumors origin ? (4, Insightful)

CodeBuster (516420) | more than 6 years ago | (#24767169)

Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?

This is what happens when technical information is filtered through the brain of a salesperson, manager, or executive. It comes out completely mangled on the opposite side or, even worse, it morphs into something which, while technically correct, is NOT the information that the non-technical person thought they were conveying (i.e. they have unknowingly modified the requirements specification in a way that is logically consistent from a technical standpoint, but will result in the wrong product being built).

Re:x86 rumors origin ? (0)

Anonymous Coward | more than 6 years ago | (#24771009)

Salespeople, managers, and executives I can understand, but what about Slashdot summaries, and responses posted without even bothering to RTFA?

Re:Inaccurate headline (1)

LWATCDR (28044) | more than 6 years ago | (#24765633)

Exactly. It is a lot easier to go into the mobile space than x86.

Anyone Surprised? (5, Interesting)

Underfoot (1344699) | more than 6 years ago | (#24765335)

Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

Re:Anyone Surprised? (2, Insightful)

Van Cutter Romney (973766) | more than 6 years ago | (#24765605)

Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?

Re:Anyone Surprised? (1)

Underfoot (1344699) | more than 6 years ago | (#24765683)

The /. article on the rumor goes into that quite a bit. http://hardware.slashdot.org/article.pl?sid=08/08/20/1917239/ [slashdot.org]

Re:Anyone Surprised? (-1, Troll)

Anonymous Coward | more than 6 years ago | (#24766121)

Your link didn't work, but I would suggest this article: http://hardware.slashdot.org/article.pl?sid=08/08/20/2137439/ [slashdot.org]

Re:Anyone Surprised? (4, Informative)

Daengbo (523424) | more than 6 years ago | (#24766739)

Check the URL before clicking.

Re:Anyone Surprised? (5, Informative)

morgan_greywolf (835522) | more than 6 years ago | (#24765801)

I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?

Yes. Various parts of the x86 architecture that have been developed within the last 20 years (notably, stuff related to the IA-32 architecture of the 386, 486, Pentium and later lines) are all still under patent.

Patents filed before June 8, 1995 get 17 years from the grant date or 20 years from the filing date, whichever is greater.
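
A quick worked example of that rule (just a sketch with made-up dates and whole years only; not legal advice, and the years below are invented for illustration):

    #include <stdio.h>

    /* Toy illustration of the pre-June-8-1995 US patent term rule:
     * the term ends at the LATER of (grant + 17 years) or (filing + 20 years).
     * Whole years only; the example dates are invented. */
    int main(void)
    {
        int filing_year = 1993;  /* hypothetical filing year */
        int grant_year  = 1997;  /* hypothetical grant year  */

        int from_grant  = grant_year  + 17;  /* 2014 */
        int from_filing = filing_year + 20;  /* 2013 */
        int expires     = from_grant > from_filing ? from_grant : from_filing;

        printf("expires in %d (greater of %d and %d)\n",
               expires, from_grant, from_filing);
        return 0;
    }

So a hypothetical 1993 filing granted in 1997 would run to 2014, which is why mid-1990s x86-related patents were still very much alive when this was written.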

Re:Anyone Surprised? (2, Interesting)

Hal_Porter (817932) | more than 6 years ago | (#24765747)

Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.

Re:Anyone Surprised? (2, Interesting)

morgan_greywolf (835522) | more than 6 years ago | (#24765949)

The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.

Possibly related to Charlie's vast holdings of AMD stock...

Re:Anyone Surprised? (1)

Hal_Porter (817932) | more than 6 years ago | (#24766171)

I heard it was some sort of falling out between Charlie and NVidia over some issue (I don't know which) that has turned into a long-running feud. He writes stuff to piss them off; they try to cut off his information about them. There is a cycle.

So I don't believe a word he says about NVidia any more.

Re:Anyone Surprised? (1)

Gromius (677157) | more than 6 years ago | (#24769543)

I thought the Inquirer hated everybody and mostly runs sensationalist news stories that turn out to be a bit iffy in the end. To me they have less credibility than some random guy's blog.

In the interests of fairness, this is maybe because I don't regularly read them, only what's picked up by Slashdot and other news stories, which tend by their very nature to be sensationalist and often made up. Which is why I don't regularly read them :)

Re:Anyone Surprised? (5, Insightful)

AKAImBatman (238306) | more than 6 years ago | (#24765981)

Is anyone actually surprised that the CEO is denying this?

Not at all. As you say, he would have denied it even if NVidia WAS planning a CPU. What actually speaks volumes, IMHO, is the vehemence with which he denied it. Any CEO who's cover-denying a market move is not going to close his own doors by stating that the company could never make it in that space. He would give far weaker reasons, so that when the announcement comes the market will still react favorably to their new product.

In other words: stick a fork in it, because this bit of tabloid reporting is dead.

Re:Anyone Surprised? (1)

dnwq (910646) | more than 6 years ago | (#24766759)

And if what you say is true, any CEO who's intending to cover-deny would be just as vehement as NVidia's CEO now.

Otherwise we would be able to tell what he's doing, and he wouldn't be able to deny anything, no?

Re:Anyone Surprised? (2, Insightful)

AKAImBatman (238306) | more than 6 years ago | (#24767625)

Otherwise we would be able to tell what he's doing, and he wouldn't be able to deny anything, no?

No. Because any CEO who immediately kills the market he's about to enter with his own statements is a fool.

If you want to get into the market of competing with Intel, you don't say that you could never make a CPU as good as Intel can.

Re:Anyone Surprised? (0)

Anonymous Coward | more than 6 years ago | (#24766937)

No, but MMX, SSE, AMD64 and other extensions (maybe also other updates to x86) probably have non-expired patents.

If they wanted x86, they would've bought VIA a long time ago.

Reprogrammable GPU? (4, Interesting)

Wills (242929) | more than 6 years ago | (#24765361)

When hell freezes over, they could release a GPU where the instruction set is itself microprogrammable with an open-source design, and then end users could decide whether they want to load the GPU's microcode with an x86 instruction set, a DSP set, or whatever.
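
For what it's worth, here is a toy sketch of the idea in plain C (purely illustrative; real GPU microcode looks nothing like this): if the decode step is just a table of operation handlers, then "loading different microcode" amounts to swapping that table, and the same fixed execution engine presents a different instruction set.

    #include <stdio.h>
    #include <stdint.h>

    /* Toy model of a "reloadable instruction set": the execution engine below
     * is fixed, and the opcode->operation table plays the role of the microcode
     * the end user would load.  Entirely made up for illustration. */

    typedef int32_t (*micro_op)(int32_t a, int32_t b);

    static int32_t op_add(int32_t a, int32_t b) { return a + b; }
    static int32_t op_sub(int32_t a, int32_t b) { return a - b; }
    static int32_t op_mul(int32_t a, int32_t b) { return a * b; }
    static int32_t op_max(int32_t a, int32_t b) { return a > b ? a : b; }

    /* Two different "instruction sets" for the same engine. */
    static micro_op isa_alu[4] = { op_add, op_sub, op_mul, op_max };  /* CPU-ish */
    static micro_op isa_dsp[4] = { op_mul, op_add, op_max, op_sub };  /* DSP-ish */

    /* One fixed execution engine: fetch an opcode/operand pair, look the
     * opcode up in the loaded table, run it against an accumulator. */
    static int32_t run(const micro_op *isa, const uint8_t *prog, size_t len)
    {
        int32_t acc = 1;
        for (size_t i = 0; i + 1 < len; i += 2)
            acc = isa[prog[i] & 3](acc, (int32_t)prog[i + 1]);
        return acc;
    }

    int main(void)
    {
        const uint8_t program[] = { 0, 5, 2, 3 };  /* opcode, operand pairs */
        printf("as ALU ISA: %d\n", run(isa_alu, program, sizeof program));  /* (1+5)*3 = 18 */
        printf("as DSP ISA: %d\n", run(isa_dsp, program, sizeof program));  /* max(1*5, 3) = 5 */
        return 0;
    }

The same two-instruction program gives different results under the two tables, which is the whole point: the engine stays fixed in silicon while the visible instruction set is whatever was loaded.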

Re:Reprogrammable GPU? (2, Insightful)

Fourier404 (1129107) | more than 6 years ago | (#24765469)

I would be very, very surprised if that was any cheaper than just buying 2, one manufactured as a GPU, the other as a CPU.

Re:Reprogrammable GPU? (2, Insightful)

Toffins (1069136) | more than 6 years ago | (#24765661)

Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets. It would be significantly simpler than the hassle of designing with FPGAs because much of the infrastructure (floating point logic etc) would already be available hardcoded into the GPU's silicon.

Re:Reprogrammable GPU? (3, Funny)

Fizzl (209397) | more than 6 years ago | (#24768053)

And I want a microwave that the customer can bludgeon into a bicycle. Where do you people get the idea that you can do hardware in software?

Re:Reprogrammable GPU? (0)

Anonymous Coward | more than 6 years ago | (#24768349)

Software radio...

Re:Reprogrammable GPU? (2, Interesting)

TheLink (130905) | more than 6 years ago | (#24768511)

But who really wants that sort of versatility- who wants so many different instruction sets? The compiler writers? I doubt more than a few people want that.

Would such a GPU be faster? It might be faster for some custom cases, but is it going to be faster at popular stuff than a GPU that's been optimized for popular stuff?

The speed nowadays is not so much because of the instruction set; it's the fancy stuff the instruction set _controls_, e.g. FPU units, out-of-order execution, trace caches, branch prediction etc.

Just look at the P4, Opteron and Core 2. For the same instruction set you get rather different speeds.

Good luck allowing buffer sizes, branch prediction logic, etc. to be changed in a programmable way while having it run faster AND not screw up.

The FPGA sort of stuff is for when you can't convince Intel, Nvidia etc to add the feature for you, because nobody else wants it but you.

Programmers who make Crysis, and programmers who make Unreal, tend to want similar stuff fast.

Maybe there might be some custom functions that are different for each popular piece of software that need to be sped up. But I don't see why you'd necessarily require a different instruction set just to use those functions.

Re:Reprogrammable GPU? (3, Insightful)

Kjella (173770) | more than 6 years ago | (#24769051)

Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets.

As long as they're Turing complete, any of them can in principle do anything. Yes, then at least to me it comes down to price - if it's cheaper to have a car, a boat and a plane than to make a transformer that can do all three, sucks at all three and costs a bajillion more, I'll go for traditional chips, thank you very much.

Re:Reprogrammable GPU? (3, Insightful)

MarcQuadra (129430) | more than 6 years ago | (#24769507)

Transmeta tried that. It was slow, expensive, and inconsistent. Also, nobody ever used any other 'instruction sets' besides x86, mostly because that's the lowest common denominator in the computing world.

It sucks, it's not the -best- way to do it, but it's the way the market seems to favor. Just ask Apple, Sun, DEC, and HP.

Difficult (3, Informative)

DrYak (748999) | more than 6 years ago | (#24765551)

Microcode upgrades are possible for CPUs that have a huge, complex, reprogrammable pipeline, like the current top-of-the-line CPUs, or CPUs where the pipeline is handled in software (like the Transmeta chips).

GPUs, on the other hand, have a very short and simplistic pipeline which is hard-wired. They draw their tremendous performance from the fact that this pipeline drives ultra-wide SIMD units which process a fuck-load of identical threads in parallel.

But there's not much you could reprogram currently. Most of the die is just huge caches, huge register files, and a crazy number of parallel floating point ADD/MUL blocks for the SIMD. The pipeline is completely lost amid the rest.
(Whereas on a CPU, even if the cache dwarfs the other structures, there are quite complex logic blocks dedicated to instruction fetching and decoding.)

Re:Difficult (2, Interesting)

Wills (242929) | more than 6 years ago | (#24765931)

I was aiming for the extreme reprogrammability and versatility that an open-source microcode CPU design with SIMD, RISC and CISC sections all on a single die would offer. Sure, the trade-off is that you don't get as much capability in each subsection (compared to the capabilities of a dedicated GPU, or a dedicated modern CPU) because the sub-sections all have to fit inside the same total area of silicon. But what you get instead is an open-source microcode CPU which has great versatility, without needing to go down the FPGA design route (even more versatile, but less simple to use).

Re:Difficult (3, Informative)

billcopc (196330) | more than 6 years ago | (#24767493)

Let me guess: you've never read anything about microprocessor engineering, have you?

What you describe is what every non-engineer dreams of. You want a chip that any idiot can reprogram, without knowing the "less simple" ways of FPGAs. That's kind of like saying you want a car that gets 200 miles to the gallon, can park in a shoebox and carry 20 kids in the back seat - oh, and it drives itself automagically so your kids can take themselves to soccer practice without bugging you.

The reason why no one ever builds such monstrosities is because there is simply no point to it, when you can have purpose-built chips designed and fabbed for a fraction of the cost. People don't stop breathing just because their device needs 2 distinct chips instead of one jesus-truck.

Hey, quit the dissing and flamebait (1)

Wills (242929) | more than 6 years ago | (#24768341)

Let me guess: you've never read anything about microprocessor engineering, have you?

Actually I do my own FPGA designs, and write microcode too. Where do you get that I "want a chip that any idiot can reprogram"? I don't. I want an open-source microcode chip on the market that I can reprogram. That's not something "every non-engineer dreams of." Purpose-built chips are fixed in purpose. I don't want that. I want versatility in a single chip. That's why I want an open-source microcode chip. I would use that in my own designs. Perhaps you've never done FPGA design? In my experience, doing an FPGA-based CPU is considerably more complex than writing microcode for an existing CPU.

Re:Difficult (2, Informative)

dreamchaser (49529) | more than 6 years ago | (#24767983)

What you are describing is a pipe dream. Even *if* they managed to do something like that, performance would be utter crap, die size would be huge, and the odds are it just plain would suck.

Re:Reprogrammable GPU? (5, Funny)

HerculesMO (693085) | more than 6 years ago | (#24765663)

If hell froze over they wouldn't have to worry about the cooling on their chips.

I guess that's a plus.

Focused (5, Insightful)

Akita24 (1080779) | more than 6 years ago | (#24765375)

Yeah, they've stayed focused on graphics chips; that's why there are so many motherboards with nVidia chipsets... *sigh*

Re:Focused (1)

Anonymous Coward | more than 6 years ago | (#24765637)

Yes, no CPU for them. Just GPU as CPU motherboards and such.

Re:Focused (0)

Anonymous Coward | more than 6 years ago | (#24765969)

Well yeah, that's true.

You could give them some benefit of the doubt and assume they were working towards SLI for a long time, and nForce 1-3 was getting their foot in the door.

Besides... AMD needed a half-decent mobo chipset, nVidia delivered, nVidia sold more graphics cards. A worthwhile distraction.

Re:Focused (1)

frieko (855745) | more than 6 years ago | (#24765993)

Well, for quite a while an nForce chipset was the only (good) way to connect your Athlon to your GeForce. Can't sell a car if there's no roads.

Re:Focused (3, Informative)

microbrew_nj (764307) | more than 6 years ago | (#24766297)

I can think of a few good reasons for Nvidia to roll their own chipsets. SLI is one. The market for integrated motherboards (with their chipset) is another.

Re:Focused (0)

Anonymous Coward | more than 6 years ago | (#24766165)

The reason they went into motherboard chipsets was to allow SLI. In other words, they went into motherboard chipsets solely to bolster their graphics chips.

Re:Focused (1)

Akita24 (1080779) | more than 6 years ago | (#24766287)

I never said they didn't have a good/valid reason; in fact, I'm damn glad they did. However, they *have* focused on something else, even if the reason for it was forwarding their graphics agenda. :-)

Re:Focused (1)

Kjella (173770) | more than 6 years ago | (#24766499)

Yeah, they've stayed focused on graphics chips; that's why there are so many motherboards with nVidia chipsets... *sigh*

Of course, if you want to deliver integrated chipsets (you know, the other, much higher volume market for graphics chips), then you have to be able to build the rest of that chip as well or it wouldn't be integrated. Seeing as the graphics capability becomes more and more important while the other features seem quite stable, it'd be much stranger for them *not* to be in that market, IMO.

Re:Focused (1)

CodeBuster (516420) | more than 6 years ago | (#24767273)

The primary reasons why motherboards don't include as many nVidia chipsets (or any other good chipsets, for that matter) as they might otherwise are (1) cost, (2) heat, and (3) space. The mainboard attempts to combine as many functions as are practical into the smallest and cheapest-to-manufacture area possible. Those who wanted the nVidia chipsets were always free to purchase the video card of their choice aftermarket and install it into the graphics slot on their motherboard. For everyone else (mostly consumers) who didn't want to pay $400+ for their motherboard, there were the Intel onboard graphics controllers, which delivered acceptable non-gaming performance for most people and kept the motherboard at just over $100 or so, instead of 4+ times that price for nVidia graphics that they might not need or want.

Focused, except for MID CPU and nForce (1)

OrangeTide (124937) | more than 6 years ago | (#24768023)

Between nForce and their new ARM11 CPU, it's hard to take comments like "we've stayed focused" too seriously.

Only reason (2, Insightful)

Z00L00K (682162) | more than 6 years ago | (#24765393)

The only reasons they might build a chip for x86 (64-bit or not) would be either to use it for a special application or as a proof of concept.

A GPU and a CPU are different, but it may be a way to test if a GPU architecture can be applied to a CPU with a classic instruction set. The next step is to sell the knowledge to the highest bidder.

To compete with Intel would just be futile.

Re:Only reason (1)

oldspewey (1303305) | more than 6 years ago | (#24765965)

To compete with Intel would just be futile.

Hopefully we won't be saying the same about AMD in another few years.

Re:Only reason (2, Interesting)

ratboy666 (104074) | more than 6 years ago | (#24768439)

How is a "GPU" different from a "CPU"? If you take them to be the SAME, you end up with Intel's Larrabee. If you take them as somehow DIFFERENT, you end up with nVidia's proclamation.

If they are considered the SAME, but with different performance tunings, other applications begin to open up.

As an example: it is currently true that the "GPU" is given an exorbitant amount of resources to do one thing -- create visuals for games.

And that's it. It contains a significant amount of the system's memory and processing logic, and "locks it away". Which is very good if you are selling the graphics cards, but not ideal (at all) for the customer.

If the graphics card can be placed closer and more generally, the customer would win. EXCEPT -- for one problem (and, boy is it a doozy).

The nVidia is programmed with a specific higher-order assembly language; we rely solely on the hardware vendor for tools. I think that this is UNIQUE in the (mass-market) processor world. And this is why Intel, with an x86-compatible GPU, is such a threat. Can anyone else produce an OpenGL shader compiler for the nVidia? Or, better yet, extend it to do NON-shader tasks? How about for the AMD? Yes, you CAN for Intel, and will, by design, be able to (I would expect, even be ENCOURAGED to) for Larrabee.

The idea is to extend the "NUMA" concept for memory to processors. Intel is doing it because others are already doing it - SUN with Niagara and Niagara 2 are providing an absolutely amazing proof of concept (except with multi-core and FPU units).

Why would you BOTHER with a specific purpose GPU, if you could have a (possibly less performant) workable solution with more cores, AND be able to use them for other tasks?

Of course this is not particularly relevant to TODAY's applications. They are matched to current hardware. Now, I will bring up the L word - Linux. Linux is suited to a much wider degree of scaling (practically) and runs on everything from ARM up to Z/Series. It also supports non-x86 ISAs. Which would mean that a non-x86 version of this idea is probably supportable. But it wouldn't run CURRENT software, and, I believe, would be a complete non-starter.

But, take this with a grain of salt -- I am obviously not a great predictor (otherwise I would already be retired).
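
To make the parent's point about relying on vendor tools concrete: today the only public way to program the shader cores is to hand source to the driver at runtime and let the vendor's own compiler target its undocumented native ISA. A minimal sketch in C (assuming an OpenGL 2.0+ context is already current and that an extension loader such as GLEW provides the entry points; this is illustrative, not a complete program):

    #include <stdio.h>
    #include <GL/glew.h>   /* assumes GLEW (or similar) supplies the GL 2.0 functions */

    /* Compile a trivial fragment shader.  The translation to the GPU's native
     * instruction set happens entirely inside the vendor's driver - there is
     * no public toolchain below this API.  A valid OpenGL 2.0+ context must
     * already be current before calling this. */
    static GLuint compile_fragment_shader(void)
    {
        const char *src =
            "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }";

        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof log, NULL, log);
            fprintf(stderr, "driver compiler said: %s\n", log);
        }
        return shader;
    }

Everything below glCompileShader is the vendor's black box, which is exactly the dependence on vendor tooling the parent is describing.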

Re:Only reason (3, Informative)

Lisandro (799651) | more than 6 years ago | (#24769231)

How is a "GPU" different from a "CPU"?

The GPU is a specialized (vector) processor, while the CPU is a general purpose one. What the GPU does, it does great. But its reach ends pretty much there.

The nVidia is programmed with a specific higher-order assembly language; we rely solely on the hardware vendor for tools. I think that this is UNIQUE in the (mass-market) processor world. And this is why Intel, with an x86-compatible GPU, is such a threat.

You're confused. Intel is not working on a "x86 GPU". Intel is working on a new GPU design - the kicker being that this is a relatively high performance one, instead of the kind of GPUs they offered so far (feature packed, but lacking in performance). The x86 instruction set has nothing to do with it, and in fact, has nothing to do with GPU programming, which is a completely different beast.

Can anyone else produce an OpenGL shader compiler for the nVidia? Or, better yet, extend it to do NON-shader tasks. How about for the AMD?

If I'm not mistaken, nVidia's Cg compiler is now open sourced. So yes.

Re:Only reason (1)

ratboy666 (104074) | more than 6 years ago | (#24769933)

The GPU is (generally) a vector processor with VERY limited branching capability, and VERY limited data sourcing. But, these things can be "fixed".

Yes, Intel is working on an "x86 GPU".

"Larrabee can be considered a hybrid between a multi-core CPU and a GPU, and has similarities to both. Its coherent cache hierarchy and x86 architecture compatibility are CPU-like, while its wide SIMD vector units and texture sampling hardware are GPU-like." (from http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org] )

As to Cg being "open sourced"? Nope, it is available, but the ISA of the nVidia chip is still closed. If I *could*, I would work on a vector/flow backend compiler for (a subset of) Scheme to support it. But I can't. I will be able to with Larrabee. I would probably base a compiler on Marc Feeley's picoscheme work. But... well, I can't.

Confident ? (1)

jpbelang (79439) | more than 6 years ago | (#24765395)

He seems rather confident with a two year head start on a company that has "billions and billions of dollars."

Re:Confident ? (0)

Anonymous Coward | more than 6 years ago | (#24766031)

>> He seems rather confident with a two year head start on a company that has "billions and billions of dollars."

The alternative is to sit down on a rock and cry about it.

Just a thought... (5, Insightful)

darkvizier (703808) | more than 6 years ago | (#24765441)

If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat" ?

Re:Just a thought... (0)

neokushan (932374) | more than 6 years ago | (#24765623)

Good logic there and you make a valid point, but being perfectly honest, 2 years in the GPU industry is more like 5 years in the CPU industry.

Re:Just a thought... (1)

Colonel Korn (1258968) | more than 6 years ago | (#24766587)

Good logic there and you make a valid point, but being perfectly honest, 2 years in the GPU industry is more like 5 years in the CPU industry.

And Intel's currently more like 6 years behind NV/ATI. LRB may change that, but Intel shouldn't count its chickens before they're rendered. Even then, don't expect LRB to approach 2 year old NV/ATI performance at the same price or power draw point.

Re:Just a thought... (1)

Kjella (173770) | more than 6 years ago | (#24767115)

If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat" ?

You're comparing apples to oranges. nVidia has 13 years of experience in the market (NV1 - 1995), but that doesn't say anything about how fast someone else could catch up or how far they'd stay behind. Anyone could shave 20+ years off Intel's "head start" easily; it's the last few years to make a competitive product that are hard. nVidia could within a few years produce a processor some years behind Intel in technology, but it'd be marketwise dead on arrival. If Intel really is 3+ years behind (you see any Larrabees this year?) then it'll be a complete and utter flop, no closer than nVidia could have been to Intel if they tried. Not saying I believe either of them, but the nVidia message is fairly clear: "It'd be as stupid for us to go after Intel as it is for Intel to go after us", pointing out how much experience they each have in their respective markets.

Re:Just a thought... (1)

darkvizier (703808) | more than 6 years ago | (#24769577)

Yeah, I agree. His wording was a bit pretentious, but I expect both companies will be in the game for a long time yet.

Regardless though, our hardware is finally going parallel. From a programmer's point of view, I'm just very happy to see things like CUDA [wikipedia.org] emerging, which will make parallel programming a whole lot more feasible. I think we're going to see some really impressive things developed as a result of this.

Re:Just a thought... (1)

immcintosh (1089551) | more than 6 years ago | (#24769833)

Doesn't matter how far behind you are in their market if the only thing in question is your own. NVIDIA has consistently put out vastly superior graphics hardware to Intel's.

Re:Just a thought... (1, Interesting)

Anonymous Coward | more than 6 years ago | (#24771637)

Some would say that the way we use devices is changing, that feature-packed cell phones, UMPCs, and specialist devices like consoles are beginning to dominate the home space. These platforms often don't use an x86 CPU. They use a RISC CPU like an ARM or a Freescale chip.
These people are significant rivals to Intel.
The XBOX and the PS2 both have quasi-CISC CPU chips designed by IBM.

What I'm saying is that although Intel probably is now the dominant player in the x86 market, this is simply leading to a lot of players making solutions that beat them in a direction Intel has not been attempting.

It would make sense for NVIDIA, with its history of embedded chips, to be one of these. Low-cost SoCs with CPU, GPU, and chipset all in one place for the thin-client/ultra-low-cost market, perhaps?

Microsoft will cross-compile their OS the moment there is demand.

wouldn't this be a good thing? (1)

BlackSnake112 (912158) | more than 6 years ago | (#24765531)

If more companies entered the same market, that would give us more choices and better prices. I say go for it, Nvidia: make a CPU and see how you do against Intel and AMD.

I really wish that we could have the same socket in the motherboard for a CPU from Intel, AMD, Nvidia, etc. That would rock and give a real head-to-head test of which CPU is best for what you are doing. Never happen, but it would be cool to see.

Re:wouldn't this be a good thing? (1)

gnick (1211984) | more than 6 years ago | (#24765689)

If more companies entered the same market, that would give us more choices and better prices. I say go for it, Nvidia: make a CPU and see how you do against Intel and AMD.

No, I do not think that would be a good thing. The up-front R&D cost for making CPUs is huge. Fabricating them ain't cheap either. Sure, NVIDIA has a lot of talent and would have a big jump on the R&D. And they have fabrication facilities that could be retuned for CPUs instead of GPUs. But I think that the end result of NVIDIA attempting to compete with Intel/AMD on the x86 CPU front would be death or serious damage to NVIDIA and we'd lose competition on the graphics card market rather than gain competition on the CPU market.

Re:wouldn't this be a good thing? (3, Informative)

ThisNukes4u (752508) | more than 6 years ago | (#24766227)

Actually nVidia doesn't own any fabs; they contract out all their chips to TSMC, same as ATI. Although now ATI/AMD are going to be making their Fusion chips at TSMC, so they (TSMC, that is) will definitely have the expertise to make x86 chips in the near future.

Re:wouldn't this be a good thing? (1)

gnick (1211984) | more than 6 years ago | (#24766397)

Absolutely correct. Perhaps I should have said 'access to fabrication facilities' or 'fabrication relationships'. The point is that they have no resource issues barring them from the game, just a lot of catch-up work, stiff competition, and the good sense to lack motivation.

'Decide what you're going to do and focus on doing it well' is a good business model and, whether you're an NVIDIA fan or not, that's certainly what they're trying. And, so far, it's working out a lot better for them than a lot of the folks that have tried to stay in the graphics card arena.

Re:wouldn't this be a good thing? (1)

mr_mischief (456295) | more than 6 years ago | (#24766319)

The real losers would be VIA and AMD. If NVidia made a big entry into the x86/x86-64 space, they would take as much or more market share from the smaller players as from Intel. NVidia would be poorly served by knocking VIA out, and especially by knocking AMD out. Even though those companies compete for graphics dollars, they give NVidia somewhere to put its graphics chips and chipsets other than on Intel-CPU boards.

Re:wouldn't this be a good thing? (1)

Eravnrekaree (467752) | more than 6 years ago | (#24767109)

There was such a socket for some time: Socket 7, around the time of the AMD K6 generation. You could put most Intel and AMD CPUs of the era into the same motherboard.

Re:wouldn't this be a good thing? (2)

MBGMorden (803437) | more than 6 years ago | (#24769117)

Not just Intel and AMD. There was a time when you could use an Intel, AMD, Cyrix, IDT, or Rise (and I'd bet even a couple more) CPU all in the same motherboard. Back then I didn't even DREAM of building a machine with an Intel chip - Cyrix and AMD were less than half the cost (close to 1/3rd the cost in some areas). And when those costs were in the hundreds of dollars for entry-level stuff (rather than the $35 that you can get a budget CPU for now), it really made a difference.

Of course, that was when the clone chip makers were really just going mainstream. These days the price of all the chips has come down, and Intel is much more competitive. You can thank AMD for that. If not for them we'd all still be paying through the nose for Intel.

Nvidia seems not to be going the route of offering an x86 chip, but I hope to goodness that AMD pulls through their current bit of trouble or another contender takes the reins, as the market will revert back to the old status if no one does.

Re:wouldn't this be a good thing? (1)

petermgreen (876956) | more than 6 years ago | (#24770129)

Indeed, and not just Intel and AMD either, but Cyrix and IDT as well. Then Intel moved to Slot 1, which IIRC involved some proprietary stuff that stopped anyone else using it. The competitors stayed on Socket 7 for a while, then AMD moved to Slot A and the others either died out or moved to processors soldered directly to the motherboard.

Re:wouldn't this be a good thing? (1)

cnettel (836611) | more than 6 years ago | (#24770885)

Mostly correct, but I actually think that VIA was Socket 370 (PIII) compatible for a while, and also stayed on a similar bus even when Tualatin was all but obsolete.

Re:wouldn't this be a good thing? (1)

swordgeek (112599) | more than 6 years ago | (#24767873)

Nice idea, but no.

CPU manufacture has become the most expensive part of computing. The cost of designing, prototyping, and then fabricating CPUs is INSANE! Worse, the price grows fantastically as the trace-size shrinks. It's been suggested that one of the reasons Intel moved so aggressively from 65nm to 45nm is to push AMD to the sidelines.

nVidia is roughly five percent the size of Intel. Trying to enter a market outside of their core competence against a behemoth like that is suicide.

rumour machine (3, Insightful)

Anonymous Coward | more than 6 years ago | (#24765533)

Rather handy that this rumour gives nVidia, a GPU company, the chance to point out how futile it would be for them to try and enter the CPU market... then point over to Intel, a CPU company, trying to make a GPU...

What they need to do is (2, Insightful)

mandark1967 (630856) | more than 6 years ago | (#24765685)

Remove their heads from their collective rectum and correct the damn problems they have with their video cards and motherboard chipsets.

I've been a loyal nVidia customer since the good old days of the Diamond V550 TNT card through the 8800GTX but they have really hosed up lately.

My 780i board has major data corruption problems on the IDE channel, and my laptop is one of the ones affected by their recall, so I am not too pleased with their ability to execute lately...

nVidia is best in graphics (1)

ilovesymbian (1341639) | more than 6 years ago | (#24765733)

In my opinion, nVidia is the best in graphics and it should stay that way.

Trying to go "up the ladder" by building CPUs will hurt it and other companies in the long run. So far, they have co-existed in peace with one another. It's just the natural flow of things.

And why not? (4, Insightful)

geogob (569250) | more than 6 years ago | (#24765745)

I wouldn't mind seeing more players in the computer processor industry. The headlines really make it sound like it would be a bad thing. Maybe I'm getting the headlines wrong, but having Nvidia present new alternatives to a market almost exclusively owned by Intel and AMD would be interesting.

Re:And why not? (1)

Joseph Hayes (982018) | more than 6 years ago | (#24766305)

I completely agree... I generally do a major upgrade or new build every 2-3 years, and I was on a tight budget earlier this year when I performed the ritual. It was a nice moment as a consumer to be able to buy a comparable (for my needs) CPU from AMD for 2/3 the cost of the Intel lineup. Sure, the Opteron X2 isn't gonna knock out a Core 2 Duo, but for my needs it was plenty, and considerably cheaper. It would be VERY nice to see a 3rd player in the game, especially if it was a company I trust as much as NVidia (I buy their GPUs exclusively). Too bad. But at least they're focused!

From 2006 (4, Insightful)

Alioth (221270) | more than 6 years ago | (#24765883)

"A GPU from 2006" sounds a lot like famous last words.

I wonder if anyone at DEC made comments in a similar vein about Intel CPUs, back when the Alpha was so far ahead of anything Intel was making. NVidia's architect should not underestimate Intel; if he does, he does it at his company's peril.

Re:From 2006 (3, Interesting)

Lumpy (12016) | more than 6 years ago | (#24766339)

The Alpha failed because the motherboards were $1300.00 and the processors were $2600.00. Nobody in their right mind bought the stuff when you could get Intel motherboards for $400 and processors for $800.00 (dual-proc boards, high-end processors).

DEC died because they could not scale up to what the Intel side was doing. You had thousands of motherboards made per hour for Intel, with maybe 4 a day for Alpha. It's game over at that point.

I loved the Alphas; I had a dual Alpha motherboard running Windows NT and it rocked as a server.

Re:From 2006 (0)

Anonymous Coward | more than 6 years ago | (#24768571)

"DEC died because they could not scale up to what the intel side was doing. you had thousands of motherboards made per hour for Intel with maybe 4 a day for Alpha."

I suspect it was more like 400 Alpha boards per day, and DEC could have made as many as they could sell. Problem was, they could not scale DOWN to Intel's price point. Not without cannibalizing their main stream of revenue.

Marching hand-in-hand with Intel, Microsoft outlasted DEC on the software side for the same reason. NT may have been seriously inferior to VAX/VMS, but at a low enough price there is no comparison.

Pricey hardware had long been a cash cow for DEC. They totally underestimated the customer's appetite for cheapie computers. Remember, Intel was selling the 486 vs. the first generation Alphas. To use a gaming analogy, it would be like selling Atari Pong vs. the X-Box (at the same time).

I loved the Alphas as well, but I REALLY love the concept of a high-performance server that can go on the air with Linux and 1 TB of storage for under $2k.

Re:From 2006 (1)

cbreaker (561297) | more than 6 years ago | (#24769059)

The same can be said about Itanium. The original Itaniums (and even the current ones) were so DAMNED expensive, and they didn't offer any real performance increase.

What really killed Itanium was AMD's x64 extensions.

Itanium will be around for a while, but it will never become commonplace outside of high-end, massively SMP UNIX servers.

Re:From 2006 (1)

ishobo (160209) | more than 6 years ago | (#24771035)

DEC died because they could not scale up to what the intel side was doing. you had thousands of motherboards made per hour for Intel with maybe 4 a day for Alpha. It's game over at that point.

You clearly do not understand the high-end market. You cannot compare low-end servers with P2 chips to systems based around Alpha (or Power, PA, etc). Alpha died because DEC was sold to Compaq (an Intel partner). Prior to the sale, Alpha systems were doing brisk business. This was 1998, folks. The P3 would not be released until the following year, and Itanium would not see the light of day until 2001.

Re:From 2006 (1)

schwaang (667808) | more than 6 years ago | (#24768247)

It's just the time-honored sports tradition of trash-talking your opponent. One example was when DEC's CEO Ken Olsen [wikipedia.org] famously said that "Unix is snake-oil".

That's just hilarious, Ken, ya Fred Thompson ugly dinosaur-scaly bastard, since a few years later I bought a DEC Alpha from you running Ultrix instead of VMS.

Re:From 2006 (1)

cbreaker (561297) | more than 6 years ago | (#24768985)

Yea, but think about it: A good GPU from 2006 is still PRETTY DAMNED GOOD!

I'm still using an AGP 6800GT in one of my machines, and it's still trucking. I can't run everything at high quality but it's usable.

Yesterday, Intel made a GPU as good as a GPU from 2002. Today it's 2006. Tomorrow they might be competitive. And honestly, with Intel GPU specs being FAR more open than nVidia's or ATI's, I welcome it. We might actually be able to get GOOD graphics, with completely open-sourced drivers, on Linux.

Re:From 2006 (0)

Anonymous Coward | more than 6 years ago | (#24770139)

Fat chance getting GOOD open-source drivers in a timely fashion (as in, before the hardware is 2-3 generations behind), unless Intel writes them.

Moaning about closed-source drivers is one of the lamest trolling techniques around.

How about Transmeta style technology? (1, Interesting)

Anonymous Coward | more than 6 years ago | (#24765967)

Rewrite the software in place to run on a different architecture (whatever their latest GPUs implement). Maybe, just maybe, GPUs have evolved to a point where interpreted generic x86 wouldn't be (completely) horrible.
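
As a very rough sketch of what "interpreted generic x86" means (a toy in C with two invented opcode encodings, nowhere near a real x86 interpreter, and Transmeta's code-morphing engine went further by translating and caching native code instead of re-decoding every instruction):

    #include <stdio.h>
    #include <stdint.h>

    /* Toy "software CPU": fetch bytes of a guest program and emulate them on
     * whatever host architecture this runs on.  Only two made-up opcodes are
     * handled; a real interpreter or binary translator is enormously more
     * complex. */
    int main(void)
    {
        /* Simplified guest program: 0x01 = add immediate to acc,
         * 0x02 = print acc, 0xFF = halt. */
        const uint8_t guest[] = { 0x01, 40, 0x01, 2, 0x02, 0xFF };

        int32_t acc = 0;
        for (size_t pc = 0; pc < sizeof guest; ) {
            switch (guest[pc]) {
            case 0x01: acc += guest[pc + 1]; pc += 2; break;  /* add immediate */
            case 0x02: printf("acc = %d\n", acc); pc += 1; break;
            case 0xFF: return 0;                              /* halt */
            default:   fprintf(stderr, "unknown opcode\n"); return 1;
            }
        }
        return 0;
    }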

Re:How about Transmeta style technology? (1)

Eravnrekaree (467752) | more than 6 years ago | (#24767279)

That would be interesting: turning a GPU into a general-purpose CPU. That way they would have a CPU without having to invest many additional resources into developing it, using the same core for both. But I have no idea if that is possible. It is likely that the GPU actually has less general-purpose processing power than a current CPU, so it might not be nearly as fast as regular CPUs. It could work for a low-end market or embedded. The ISA, though, may be designed around 3D graphics operations, and perhaps you wouldn't have the arithmetic and logic needed, but there could be a way to use graphics operations to do general calculations, who knows.

NVidia's Architect (1, Funny)

Anonymous Coward | more than 6 years ago | (#24766079)

http://www.hackthematrix.org/matrix/pics/m3/arch/1.gif

How nVidia "Survived" (5, Insightful)

Bruce Perens (3872) | more than 6 years ago | (#24766263)

I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.

3DFx was the first company to publish Open Source 3D drivers for their 3D cards. nVidia sued them, then bought them at a discount, and shut down the operation. So, we had no Open Source 3D for another 5 years.

That's not "staying focused". It's being a predator.

Bruce

Re:How nVidia "Survived" (5, Insightful)

Rufus211 (221883) | more than 6 years ago | (#24766683)

What on earth are you talking about? 3DFx died because it was horribly mismanaged and ran out of money. There were lawsuits, but 3dfx sued NV first in 1998 and then in 2000 NV counter-sued (source [bluesnews.com] ). True, NV's countersuit was right before 3dfx died, but a simple lawsuit that's gone nowhere in the courts yet doesn't cause a company to go bankrupt overnight.

Personally, I'll believe the explanation one of my (ex-3dfx Austin) friends gave for their downfall: the fully stocked tequila bar that was free to all employees. Or there's a whole list of problems leading to their decline on Wikipedia [wikipedia.org] .

Re:How nVidia "Survived" (1)

Bruce Perens (3872) | more than 6 years ago | (#24766967)

Pixar has had a great many employee perks, starting with cohabitant insurance benefits long before they were profitable. It's not very well known that they went bankrupt, repurchased employee stock, and refinanced once, although with Steve Jobs as the only major creditor they didn't need to go through formal bankruptcy in court.

They asked a lot of employees, and the benefits had to match that.

I think nVidia's lawsuit was strategically positioned to be the straw that closed out additional investment prospects for 3DFx and pushed them into formal bankruptcy.

I am mostly concerned with this because that was our only source of 3D cards with Open Source drivers, and nVidia killed it, and we really only recovered from that over the past year or so with Intel and ATI's releases.

Bruce

Re:How nVidia "Survived" (0)

Anonymous Coward | more than 6 years ago | (#24769843)

And yet somehow Pixar managed to make movies during the interim. Sounds like these open source drivers were non-essential to Pixar's business, and the notion of "recovering" from a lack of them is just posturing.

Re:How nVidia "Survived" (1)

rtechie (244489) | more than 6 years ago | (#24768585)

3DFx died because NVIDIA crushed them with the GeForce. 3dfx had already released a very disappointing product in the Banshee (it was buggy and slower than the Voodoo 2 SLI that preceded it). Hardware T&L, controversial at the time, proved to be a killer feature.

Re:How nVidia "Survived" (0)

Anonymous Coward | more than 6 years ago | (#24766687)

I don't think I follow your logic. What does one thing have to do with the other? They were able to outmaneuver and overtake 3dfx. How does that negate the claim of succeeding by staying focused?

Re:How nVidia "Survived" (0)

Anonymous Coward | more than 6 years ago | (#24766891)

You mean it had nothing to with this [photobucket.com] ?

Re:How nVidia "Survived" (5, Interesting)

alen (225700) | more than 6 years ago | (#24766955)

3dfx's problem was they could never figure out how to sell their cards. They flip-flopped from making the cards themselves to having others make them, like Nvidia does. After flipping so many times, no one wants anything to do with you, because it's bad for business planning.

nVidia has had its current selling model for 10 years and only its partners have changed. If you want to sell video cards, you can trust that if you sell cards based on nVidia's chips, they won't pull the rug out from under you next year and decide to sell the cards themselves.

Re:How nVidia "Survived" (3, Interesting)

Bruce Perens (3872) | more than 6 years ago | (#24767263)

Pixar had an OEM model too, back in its days of making hardware and software products (the Pixar Image Computer, RenderMan, RenderMan hardware acceleration) while waiting for the noncompete with Lucasfilm to run out. It's a very difficult way to run a business, because you have to pull your own market along with you, and you can't control them.

It does look like 3DFx bought the wrong card vendor. They also spun off Quantum3D, then a card vendor, which is still operating in the simulation business.

Re:How nVidia "Survived" (0)

Anonymous Coward | more than 6 years ago | (#24767663)

You mean the 3dfx that pursued a developer for creating a Glide implementation for Direct3D? The one who couldn't compete on image quality or price? Those poor heroes.

That's not "staying focused".It's being a predator (1)

bagofbeans (567926) | more than 6 years ago | (#24771165)

Actually, it's both. Bruce - you just don't like predatory behaviour, and I don't either. Removing competition is a common tool to relax a rapid and expensive development pace.

intel is a process company (2, Insightful)

Steveftoth (78419) | more than 6 years ago | (#24769267)

They are very good at doing research into making their chips very cheap to make, and they own the whole stack of production from start to finish. This is how they have managed to make it despite many, many missteps along the way.

nVidia doesn't own the factories that they use to make their chips; they just design them and use foundries like TSMC. nVidia would be stupid to compete with Intel in the same space (x86 CPUs) until they own and can efficiently run fabs like Intel does.

AMD was the only other one doing it, as they tried their best to own all their own fabs; however, they are running in the red and are trying to sell some of them now. We'll see if they can pull it together, but still, they are one of the only other companies out there that actually tries to build the chips from start to finish.

Intel's latest graphics offering is going to fail, not because they don't have the hardware (actually their new Larrabee looks really fast), but because their graphics drivers have always stunk and there is little evidence to suggest that they will be able to make a leap forward in graphics driver quality that will make their solution better than AMD's or nVidia's. They have to write full DX9, DX10, and OpenGL drivers to really compete with nVidia, then they have to optimize all those drivers for all the popular games (because nobody will re-write Doom, HL, UT, FarCry, etc. just for this new graphics card).

It could happen, but will it?

I do hope that Larrabee turns out to be an awesome coprocessor for other tasks. We'll just have to see if people actually port their code to it.

of course they deny (1)

po134 (1324751) | more than 6 years ago | (#24769903)

Nvidia has denied rumours that the company is planning an entry into the x86 CPU market

Of course they've denied building an x86 CPU; they're working on an x64 model. 'nuff said.

Nvidia has denied... not really. (2, Interesting)

bagofbeans (567926) | more than 6 years ago | (#24771069)

I don't see an unequivocal denial in the quotes. Just an implied no, and then answering a question with a question. If I were defining products at Nvidia, I would propose an updated VIA C7 (CPU+GPU) product anyway, not a simple standalone CPU.

"That's not our business. It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused."

"Are we likely to build a CPU and take out Intel?"