NVIDIA To Exit Chipset Business

kdawson posted more than 4 years ago | from the told-you-so dept.

AMD

The rumor that we discussed a few months back is looking more real. Vigile writes "Once the darling of the enthusiast chipset market, NVIDIA has apparently decided to quit development of future chipsets for all platforms. This 'state of NVIDIA' editorial at PC Perspective first highlighted the fact that the company was backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions. That effectively left NVIDIA out in the cold in terms of high-end chipsets, but even more interesting is the later revelation that NVIDIA has only one remaining chipset product to release, what we know as ION 2, and that it was mainly built for Apple's upcoming products. NVIDIA still plans to sell its current offerings, like MCP61 for AMD platforms and current generation ION for netbooks and nettops, but will focus solely on discrete graphics options after this final release."


Intel? (4, Insightful)

_PimpDaddy7_ (415866) | more than 4 years ago | (#29694595)

Do we get mad at Intel?

This is a sad day.

Competition is good, I'm sorry.

WebGL (5, Insightful)

tepples (727027) | more than 4 years ago | (#29694657)

Do we get mad at Intel?

Yes. Intel hasn't produced a competitive GPU for its integrated graphics. This will become painfully apparent once web sites start to use JavaScript bindings for OpenGL ES [khronos.org].
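
To make that concrete, "JavaScript bindings for OpenGL ES" just means a page grabbing a GL context from a canvas and issuing draw calls from script. A minimal TypeScript sketch (the "experimental-webgl" context name reflects the pre-standard builds of the time; the fallback message is illustrative, not from any real site):

    // Probe for a WebGL (OpenGL ES 2.0) rendering context.
    const canvas = document.createElement("canvas");
    const gl = (canvas.getContext("webgl") ||
        canvas.getContext("experimental-webgl")) as WebGLRenderingContext | null;

    if (gl === null) {
        // No GPU-backed context: the page falls back to 2D canvas or static content.
        console.log("WebGL unavailable");
    } else {
        // Even this trivial clear executes on the GPU; once pages push real
        // geometry through calls like these, a weak IGP shows up immediately.
        gl.clearColor(0.0, 0.0, 0.0, 1.0);
        gl.clear(gl.COLOR_BUFFER_BIT);
    }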

Re:WebGL (4, Insightful)

Anonymous Coward | more than 4 years ago | (#29696453)

I'm not looking forward to that day. Everything done with JavaScript so far has sucked filthy penises.

Take the stupid comment slider here at Slashdot, for example. The old non-AJAX approach worked just fine. You didn't have to click "More" and then wait, click "More" and then wait, etc. hundreds of times just to see all of the comments.

And you could view the -1 comments more easily, as well. Even now I still don't know how to show the hidden comments. The piece of shit sidebar panel says "# Hidden", but I pull on the dragger thing and it refuses to move! The other one works fine, though.

I see the equation as being:
Idiot Web Developers + JavaScript + OpenGL ES = Totally Fucking Horrible Web Sites Which Make Me Want to Cry

Re:WebGL (1)

tepples (727027) | more than 4 years ago | (#29696923)

Idiot Web Developers + JavaScript + OpenGL ES = Totally Fucking Horrible Web Sites Which Make Me Want to Cry

Not all web developers are idiots. What happens when Google's non-idiot developers start doing amazing things with WebGL?

Re:WebGL (1)

Jaysyn (203771) | more than 4 years ago | (#29696455)

My laptop with its built-in ATI PCIe chip recently died, and I had to swap the drive into another Dell laptop I had lying around, one whose hard drive had died on a friend who basically didn't want it anymore. It had one of those Intel IGP chips in it. I was pleasantly surprised that it would still play NWN and Dungeon Keeper II. I was freaking shocked that it played DDO, and that both ran faster and looked even better than they had on the ATI PCIe chip.

Re:Intel? Probably Not. (4, Insightful)

mpapet (761907) | more than 4 years ago | (#29694805)

I would argue Intel's strength relies in part on U.S. intellectual property laws and procedures. If the country loosened intellectual property law, Nvidia might actually have a chance.

But this is also about a global market where 80% of product comes from maybe 10% of all possible manufacturers and there are few laws preventing Intel from doing all kinds of market shenanigans in places like China.

I know the loosening of intellectual property laws would help Nvidia's case, but I don't think it would bring about a semi-competitive marketplace because this market (global OEM) has few legal constraints.

Re:Intel? Probably Not. (3, Interesting)

rgviza (1303161) | more than 4 years ago | (#29696425)

Then there's also the whole thing of nVidia producing utter crap chipsets... That might have a teeny weeny little something to do with it.

It has nothing to do with Intel's "market dominance" and everything to do with nVidia's inability to be competitive in a market segment they know little about, and the shoddy crap they try to pass off as a chipset. Once you put the Kool-Aid down and take an objective look at their product, it simply sucks.

I've had 3 of them, and all three were utter garbage. DFI, Gigabyte, ASUS, it didn't matter. Every time it turned out to be the MCP in the chipset, or some other part of it, failing or never working correctly to begin with. In one case the interrupt controller didn't work at all with a dual-core CPU, and on both Linux and MS Windows they had to put a "software" interrupt controller in the kernel to make it work. As you might guess, this made multi-CPU performance _worse_ than a single CPU. And this was a chipset designed for multi-core CPUs.

I've subsequently had 2 AMD crossfire chipsets, both worked perfectly. nVidia chipsets are 0-3 in my book.

Good riddance...

That's hundreds of thousands of consumers that won't get burned. Intel or AMD chipsets for the win...

Re:Intel? (5, Interesting)

linhares (1241614) | more than 4 years ago | (#29694845)

I think this is a clever ploy to make Intel play nice with Nvidia. By "letting go" of the market, true or not, Nvidia sends a message that Intel is a monopoly, which puts Intel in a much worse position (remember the EU) than it had when competing with Nvidia in the chipset arena. Obviously, it's impossible to know what's going to happen. But if I were at the top @ Intel, I'd be freaking out a little, because this tiny little company "we have crushed" (that's how Nvidia makes it look) will put us in the spotlight with regulators. I'm gonna go get some popcorn.

Re:Intel? (1)

rcamans (252182) | more than 4 years ago | (#29696531)

What about AMD, VIA, SiS, ATI, and ALI?

Re:Intel? (1)

John Betonschaar (178617) | more than 4 years ago | (#29696819)

AMD = ATI
VIA = mostly down the small-form-factor road these days (to complement their Epia boards and Nano CPUs)
SiS = are they even in the chipset business anymore? I can't remember seeing a motherboard with a SiS chipset in the last few years.

So it seems AMD/ATI is the only 'real' competitor left.

Re:Intel? (2, Informative)

Kamokazi (1080091) | more than 4 years ago | (#29696937)

AMD doesn't make Intel chipsets; VIA's modern offerings are laughable (hit up Newegg: ONE LGA775 VIA mobo); SiS seems to have gone OEM-only, and I haven't seen one of their chipsets in ages, so presumably their market share is minuscule. ATI stopped making new Intel chipsets when they got bought out (one old one is still available), and ALI seems to have dropped out of the chipset business years ago.

Re:Intel? (3, Insightful)

noundi (1044080) | more than 4 years ago | (#29695067)

Do we get mad at Intel?

This is a sad day.

Competition is good, I'm sorry.

This is competition. Just not one of the occasions that you like.

Not Intel (5, Insightful)

Groo Wanderer (180806) | more than 4 years ago | (#29695267)

"Do we get mad at Intel?"

Yeah, they made Nvidia look bad by putting out chipsets that met spec, survived average use, and then had the gall to not hide the fact! (see http://support.apple.com/kb/TS2377 [apple.com]) I mean really, how can Intel do business like that? And people wonder why Nvidia is bailing, then trying to hide it before Wall Street notices and downgrades them further.

The story goes like this.
1) Nvidia stops designing future chipsets
2) Nvidia blames Intel for nebulous atrocity
3) Nvidia hides the facts
4) It gets out
5) Nvidia admits it
6) Wall Street notices (several analyst reports out on the subject today)
7) Nvidia realizes that Wall Street noticed
8) Nvidia backpedals, hard, fast, and with all due slime

The 'denial' they are throwing around now states that they are not going to develop AMD chipsets anymore, not going to develop Intel chipsets anymore, and only going to continue selling the ones they have made. Until Intel stops making FSB chips in a few months, then it WILL be Intel's fault somehow.

Back to the original question, can you explain how Nvidia voluntarily stopping design of AMD chipsets is Intel's fault? :)

I saw this coming a year ago when they stopped most if not all future chipset products. I wrote it up. Nvidia denied it. A year later, they announce a stoppage for a few hours until the implications sink in. Then they deny it.

Yup. Intel. Those bastards!

I agree about the competition part, but this isn't sad, it was planned.

                    -Charlie

Re:Not Intel (4, Informative)

ByOhTek (1181381) | more than 4 years ago | (#29695743)

That's GPUs, not mobo chipsets.

And pretty much every manufacturer has had screwups. That being said, nVidia has made some nice performance chipsets in the past, and it's a shame to see them go. Really, in my experience and in terms of reliability, they have been the only company to produce chipsets that could compete with Intel.

Re:Not Intel (2, Informative)

Groo Wanderer (180806) | more than 4 years ago | (#29696041)

Yup, you are right, but the same thing happened with their chipsets, same problem. Look up the recent Sony admission on the same topic, and Dell and HP along with many others. I won't keep spamming my own links/stories here; you can find them, and a lot more, with a little searching.

I would not say their chipsets were reliable or bug-free, but they did have speed at times. That may be OK for a home user, but given the data corruption problems in their RAID setups, drive controller issues in general, networking features that never functioned right, and more, people avoided them for any real use. The only reason they looked good is that until recently the competition (ATI, VIA and Broadcom) was far, far worse.

Intel was almost always more reliable, more stable, and less bug-ridden. ATI cleaned up its act with the release of the 6xx-series chipsets and has been moving steadily upward since. The others went away.

If you want a good example of Nvidia reliability, go get the NUFI lawsuit against them; it details 10 (from memory, I might be off) of the chips that died, the companies affected, and Nvidia's (financial) claims about them. Mike Magee did that one on TGDaily; I went into it a bit more on The Inq, but the lawsuit is well worth reading if you think NV can produce a reliable chip. It isn't fanboi ranting, it is legal filings. If you can't find it, email me at semiaccurate.com and I will send the PDF. Make sure your mail can take a big attachment though, I think it was in the 5-10MB range.

                    -Charlie

Re:Not Intel (0)

Anonymous Coward | more than 4 years ago | (#29696805)

Is that you again Charlie, the NVIDIA basher?

Re:Intel? (1)

ByOhTek (1181381) | more than 4 years ago | (#29695685)

Mad is an understatement.

This article needs a GODDAMNIT tag...

Re:Intel? (1)

Hurricane78 (562437) | more than 4 years ago | (#29695961)

No. More like their chipsets having been utter crap for some years now. Always hailed as the greatest in the tests, but when you actually buy them, you notice weird things. Like the main bus not being wide enough, so that an average RAID 0 can make professional sound cards crackle beyond usability. Or the built-in NIC being so bad that you have to buy another one and disable the on-board one in the BIOS to keep it from crashing your OS on the first packet it transfers. Things like that.

I would never, ever buy another nVidia-based mainboard in my life.

I think it's quite good for them to concentrate more on their graphics cards, since AMD/ATi is now a pretty strong competitor. And if Intel finally enters the market for gaming cards, they will have a hard wind blowing in their faces. It'd be sad to see nVidia go out of the graphics business too, as they are doing quite well compared to ATi in terms of quality. (Ask the demo scene, every Linux user who tried them, Carmack, or any other game developer out there. ^^)

Re:Intel? (1)

obarthelemy (160321) | more than 4 years ago | (#29696265)

True for me. The last nVidia chipset I was happy with was the nForce 1: perfect reliability, outstanding sound. Since then I've always had small problems, like RAM sticks working everywhere except with their chipsets, heavy HD loads causing OS crashes, heavy USB loads causing OS unresponsiveness...

I blacklisted nVidia chipsets years ago, so I personally won't miss them. A monopoly or duopoly is never good, though.

Re:Intel? (1)

rcolbert (1631881) | more than 4 years ago | (#29696951)

Agreed with most of the sentiments thus far. I really like my Nvidia GPUs and Intel chipsets. Both have been reliable and solid for the past few generations. Do I like the reduction in competition in the marketplace? No, not really. But I do like it when companies focus on their core competencies. These are both highly specialized markets. If Nvidia produced a better chipset, I'd be more upset.

Re:Intel? (0)

Anonymous Coward | more than 4 years ago | (#29696873)

Competition is good, I'm sorry.

Yes, it is.

But outside of graphics cards, nvidia sucks, especially when it comes to AHCI/SATA drivers.

Bad idea?? (1)

Absolut187 (816431) | more than 4 years ago | (#29694605)

If I understand this correctly, NVIDIA is getting out of the "integrated GPU" on motherboard business?

That seems like a really bad idea, given that the vast majority of PC users have no need for a dedicated graphics card.

Are the profit margins too slim on integrated graphics chips?
Or are they just tired of dealing with Intel's legal dept?

Re:Bad idea?? (2, Informative)

Anonymous Coward | more than 4 years ago | (#29694715)

No, they are stopping production of their nForce line of motherboard chipsets.

Re:Bad idea?? (2, Informative)

Sandbags (964742) | more than 4 years ago | (#29695021)

ONLY for the new i5/i7 architecture and beyond...

Re:Bad idea?? (1)

commodore64_love (1445365) | more than 4 years ago | (#29695245)

>>>ONLY for the new i5/i7 architecture and beyond...

I for one welcome our new Intel overlords. Maybe Apple will get smart and switch to AMD-based Macintoshes. Too bad the 68000 series no longer exists; we could have had some real alternatives.

Re:Bad idea?? (-1)

Anonymous Coward | more than 4 years ago | (#29695393)

Don't you mean "PowerPC series"?

Re:Bad idea?? (2, Interesting)

Sandbags (964742) | more than 4 years ago | (#29696371)

Well, as Apple made public knowledge when they switched to Intel (not an exact quote): "we develop, compile, and test OS X on multiple hardware platforms, and have since the very first day of development; we include new processor platforms as they become available, and can change to an alternate platform at any time."

IBM appears to be working on a low-power P6/P7 architecture, AMD has some nice new stuff and now has its own fab for low-power CPUs, and I'm sure they're compiling against Atom and likely even Cell...

Honestly, as long as GPUs remain separate from CPUs, it's long past time for the north/southbridge to be integrated into the core CPU silicon. They already added the memory controller and other mainboard resources; now the base system bus and other common components could all be included. nVidia really is doing the right thing by moving into alternate markets; this one IS dying. It may actually be good for both nVidia and Intel: it gives Intel an advantage in being able to move away from current trends more easily, and it gives nVidia a more consolidated and focused research effort on GPU/CPU acceleration and generic core processing technology.

nVidia will still reap a LOT of profit from existing systems for years, and it makes a killing in GPUs and AMD chipsets. Saving this research money, shutting down the facilities, and in the end almost certainly winning a case against Intel for a few hundred million in cash down the road: this is a great opportunity for them, and I commend their decision.

Re:Bad idea?? (2, Interesting)

MBGMorden (803437) | more than 4 years ago | (#29695653)

Which means what the GP said. Nvidia's integrated graphics solutions come in the form of Nvidia chipsets (of which the nForce is the most common). If they're no longer making chipsets, then they're no longer making integrated graphics. There's still the possibility of a maker taking a discrete chip and adding it separately to the motherboard PCB, but with virtually every modern northbridge chip having built in graphics already I don't see that happening. The people who are satisfied with integrated will use that, the people who want something better will want to do so via upgradeable addon cards.

Truthfully, I just don't see the wisdom in this decision. I'd have sooner expected Nvidia to announce that they were leaving the discrete graphics chip market rather than the chipset market.

Re:Bad idea?? (1)

obarthelemy (160321) | more than 4 years ago | (#29696339)

They are in a legal dispute with Intel and currently cannot produce chipsets for Intel's new CPUs.

They probably find that they cannot recoup the costs of developing an IGP chipset for just the AMD platform.

And in the quite short term (1 yr), video will move off the chipset and onto the CPU package, making IGP chipsets a dead end.

Since the video part has always been the strong point of nVidia's chipsets, they see no point in continuing in the chipset business with non-IGP parts. I understand why.

If I were them, I'd jump into the CPU business though, for fear of getting Matroxed into irrelevance.

Re:Bad idea?? (1)

MartinSchou (1360093) | more than 4 years ago | (#29694855)

Let me quote the fine summary:

[T]he company [is] backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions.

I'll let you decide which of these two questions that quote answers:

Are the profit margins too slim on integrated graphics chips?
Or are they just tired of dealing with Intel's legal dept?

Re:Bad idea?? (1)

PhrostyMcByte (589271) | more than 4 years ago | (#29694895)

They are stopping their nForce line of chipsets (as in, northbridge/southbridge). I couldn't be the only one to see this coming a mile away, could I? Before AMD acquired ATI, they and Nvidia were perfect partners. After that they became a lot less relevant. With Intel and AMD producing their own well regarded "gamer-grade" products for some time now, I can see why Nvidia sees little point in fighting.

Re:Bad idea?? (1)

LWATCDR (28044) | more than 4 years ago | (#29694979)

All of the above, plus Intel is going to put the GPU on the CPU soon.
Intel is going to kill the integrated graphics market with that move, and AMD/ATI is planning to do the same thing.
Since Intel's GPUs are terrible, we will just have to wait and see what comes of this.
The big impact I see is on Apple. They are really tied to Intel but have been using nVidia GPUs.

Re:Bad idea?? (1)

mikael (484) | more than 4 years ago | (#29695135)

Intel's Larrabee multi-core CPU/GPU [wikipedia.org] should be interesting to see.

Re:Bad idea?? (0, Redundant)

LWATCDR (28044) | more than 4 years ago | (#29695191)

It isn't shipping so it isn't real yet.

Re:Bad idea?? (3, Interesting)

Kjella (173770) | more than 4 years ago | (#29695269)

Intel will be putting graphics on the CPU, according to their roadmap.
AMD will be putting graphics on the CPU, according to their roadmap.

At that point the GPU is already a "sunk cost"; no one will buy an integrated GPU that's only slightly better than another integrated GPU. It's also not only about legal reasons, but about pricing, timing, access to resources and so on. Intel can increase license costs, do accounting so more profit goes on processors, delay launches of competing chipsets, deny access to resources for working out incompatibilities or instabilities, and so on. Intel is doing extremely well and is ready to make that land grab, one way or the other. I think nVidia plays better as the victim of Intel's legal department than as someone gently pushed out the door as the GPU joins the CPU.

Re:Bad idea?? (0)

Anonymous Coward | more than 4 years ago | (#29695941)

Except that Intel cannot build a decent GPU. Only a mediocre one.

It's because of latency (1, Funny)

pizzach (1011925) | more than 4 years ago | (#29695525)

I stopped using discrete modems and went for winmodems (softmodems) almost immediately because of the latency getting the data through the serial port [linmodems.org]. Sadly, it is the same for graphics cards, which is why you will never catch me dead with one in my machine. I will pown (sic) you all every day of the week.

What about their embedded procs? (1)

BadAnalogyGuy (945258) | more than 4 years ago | (#29694637)

Tegra, Tegra, wherefore art thou Tegra?

Re:What about their embedded procs? (4, Funny)

Mortice (467747) | more than 4 years ago | (#29694673)

Tegra, Tegra, wherefore art thou Tegra?

I'm not. Wherefore do you ask?

I stopped using nvidia motherboard (2, Interesting)

TheGratefulNet (143330) | more than 4 years ago | (#29694707)

due to many problems. reports of data corruption at the design level (not build or parts but *design* faults). their ethernet drivers were horribly reverse engineered and never came close to the stability of the eepro1000, for example. at least on linux.

there were issues with sata and compatibility.

in short, they were over their heads. glad they finally admitted it (sort of).

Re:I stopped using nvidia motherboard (0)

Anonymous Coward | more than 4 years ago | (#29694877)

Give credit where it is due. During the Athlon64 days (socket 939?), Nvidia were in a class of their own.
Unfortunately Nvidia have just been muscled out of their niche. If they had released a killer chipset at the right price, they could have lasted longer.

Re:I stopped using nvidia motherboard (2, Interesting)

RMingin (985478) | more than 4 years ago | (#29695439)

Give credit where it is due. During the Athlon64 days (socket 939?), Nvidia were in a class of their own.

They were only in a class of their own because no one else was attending that school. Via was always a joke, Nvidia just provided the punchline. AMD was pulling out of chipsets at that point, and Intel had no interest in chipsets for AMD CPUs. Who then now?

AMD bought ATI, and between the two of them they managed to synthesize half a decent chipset, and et voilà, Nvidia is irrelevant. Since no one on the Intel side ever had much love for NV, they managed to put THEMSELVES out in the cold.

nForce2 was the high point for Nvidia chipsets. Since then it has all been a slow decline.

Re:I stopped using nvidia motherboard (0)

Anonymous Coward | more than 4 years ago | (#29695453)

data corruption at the design level (not build or parts but *design* faults)

That's nothing of note. Every single chipset has various bugs and issues with buses. There are about 9001 ways to fuck up the PCIe bus, apparently.

Doesn't look good for Nvidia (4, Interesting)

PolarBearFire (1176791) | more than 4 years ago | (#29694763)

They had better have a compelling product with the upcoming Fermi then, but from what I hear they're trying to design their GPUs for more general-purpose computing, specifically scientific computation. It's a really big gamble and I can't see it being a huge market. Their upcoming products are supposed to have 3 billion transistors, which is more than 4x the count in an i7 CPU. It's probably going to cost a ton too.

Re:Doesn't look good for Nvidia (1)

linhares (1241614) | more than 4 years ago | (#29694937)

It's probably going to cost a ton too.

Sure it will, but it's meant as a replacement for a clusterf*ck of metal that costs in the millions. If it can compete with small supercomputers, they have a good chance IMO. They're also attacking from below with Tegra, and with ChromeOS running on ARM, so I think Nvidia is a company to watch.

Re:Doesn't look good for Nvidia (1)

Svartalf (2997) | more than 4 years ago | (#29695419)

Ah...but NVidia said WinCE was the way to go on ARM...

Re:Doesn't look good for Nvidia (2, Interesting)

MBGMorden (803437) | more than 4 years ago | (#29695737)

You have to look at the target market though. Sure, they might be the deal of the century for the occasional scientist looking for supercomputer power on a budget, but in reality, few regular users (hell, few extreme power users) need anything resembling a supercomputer. It's not just raw speed: supercomputers are designed much more for parallel processing, and a ton of what users do is more suited to serial processing.

Overall, I think they do indeed have a target market - I just don't see that target market being sufficient for them.

Re:Doesn't look good for Nvidia (1)

MartinSchou (1360093) | more than 4 years ago | (#29694991)

And yet the 3 billion transistors is only 50% more than the AMD Radeon HD5800 series.

Considering that they're adding general-purpose functionality and direct C++ programming to the chip, it might not be entirely unreasonable to add an extra billion transistors. But time will tell.

Re:Doesn't look good for Nvidia (2, Interesting)

blackchiney (556583) | more than 4 years ago | (#29695027)

I don't see why they couldn't go for the GPGPU and scientific computation market. They've acquired a lot of SGI and Cray IP. The x86 has been done to death; except for more cores and a faster bus, there isn't much R&D left there. And I'm not really sure why they got into the chipset business in the first place: Intel and AMD had it sewn up, leaving very little for a third competitor.

Their core competencies are in GPUs, and they have a lot of IP there. This is valuable for negotiating licenses against the likes of Intel. And Intel's only dominance is in low-margin integrated GPUs, which is great for retailers but not great for the R&D team.

Re:Doesn't look good for Nvidia (0)

Anonymous Coward | more than 4 years ago | (#29697021)

Scientific computing is a market with deep pockets - I wouldn't discount it so easily. Furthermore, as GPUs become easier to program, thanks to CUDA and OpenCL, you can expect new markets to emerge. For example, image processing routines that take on the order of seconds or minutes to complete on serial architectures can be reduced to the order of milliseconds on massively parallel architectures. This has the effect of making a larger class of algorithms suitable for real-time applications on commodity hardware - that's a big deal!
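
To put the parallelism argument in concrete terms, here's a minimal sketch (in TypeScript; the function name and RGBA buffer layout are illustrative, not from any particular library). A serial brightness pass visits pixels one at a time; CUDA/OpenCL instead map the loop body across thousands of concurrent threads, one per pixel, which is where the orders-of-magnitude speedup comes from:

    // Serial per-pixel brightness adjustment over an RGBA byte buffer.
    // On a GPU, the loop body becomes a kernel launched once per pixel,
    // so the outer loop disappears into parallel hardware.
    function brighten(pixels: Uint8ClampedArray, delta: number): void {
        for (let i = 0; i < pixels.length; i += 4) { // stride 4: R, G, B, A
            pixels[i] += delta;     // R (Uint8ClampedArray clamps to 0..255)
            pixels[i + 1] += delta; // G
            pixels[i + 2] += delta; // B
            // pixels[i + 3] is alpha; left untouched
        }
    }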

I'm hoping to hell... (2, Insightful)

MrNemesis (587188) | more than 4 years ago | (#29694827)

...that nVidia are at least going to make a stab at providing graphics-enabled southbridges or something, because for things like HTPCs, an Intel CPU + nVidia integrated graphics is brilliant. If I'm in a market that calls for integrated graphics (in the case of HTPCs, for power usage and space considerations), then the GPU is more important than the CPU... and I find myself being pushed to AMD for the whole platform.

Intel is really shooting themselves in the foot with all the bus licensing stuff, IMHO. By scaring off nVidia IGPs, they're left with their own mediocre offerings which, in my experience, are vastly inferior even in graphics tasks that don't involve 3D.

If nVidia can supply us with minuscule IGPs-on-a-PCIe-stick-for-a-tenner then great, but their recent developments seem to be pushing them into niche applications (bigger and bigger GPU dies, primarily) and I'm worried an Intel platform will make me choose between an Intel IGP and a power-guzzling graphics card. Heck, pretty much every machine I've built for others in the last five years has come with an ATI or nVidia IGP, because I don't know anyone that games.

Disclaimer: I have every type of GPU in my house. I use nVidia IGPs for all my HTPCs since they're the only ones that are consistently good for HD content under both Windows and Linux. Intel IGPs suck for video (my X3100 can't keep up with SD x264 scaled to a 1900x1200 screen without tearing and lag) but are fine for my laptops (low power usage preferred), and I have a mix of ATI and nVidia graphics cards in the machines that need 3D. I was annoyed enough when nVidia IGPs stopped appearing for AMD boards, but not having them at all will be a serious pain in the arse.

Re:I'm hoping to hell... (1)

obarthelemy (160321) | more than 4 years ago | (#29696623)

Have you tried AMD IGPs? They're quite good for HTPC.

This would be more interesting... (2, Informative)

Anonymous Coward | more than 4 years ago | (#29694861)

... if it weren't a complete fabrication [hardocp.com].

Who modded this up? (2, Informative)

pavon (30274) | more than 4 years ago | (#29695115)

That has absolutely nothing to do with the story in question. It is a refutation of the ridiculous claim that "Nvidia is abandoning the entire high end and mid-range graphics market".

Re:This would be more interesting... (3, Informative)

MrNemesis (587188) | more than 4 years ago | (#29695125)

That's a different fabrication, which was rebutted a couple of days ago. The exiting of the chipset market appears to be real, as they've put all current R&D on hold [techreport.com].

Re:This would be more interesting... (1)

MrNemesis (587188) | more than 4 years ago | (#29695159)

Damnit, should have used preview. That should have read "as they've put all current R&D on hold [techreport.com] ".

Re:This would be more interesting... (1)

Hognoxious (631665) | more than 4 years ago | (#29696035)

Is that the one about the graphics card that's made of wood? That means it's a witch. Burn it!

This is False (5, Informative)

Sycon (1622433) | more than 4 years ago | (#29694879)

http://www.tomshardware.com/news/nvidia-gpu-graphics-chipset,8821.html [tomshardware.com] They have explicitly stated they have no intention of leaving the chipset business.

Re:This is False (1)

Dyinobal (1427207) | more than 4 years ago | (#29694995)

Interesting. Something on the internet that isn't true. Worse, it's on Slashdot... oh, too bad it's not the glory days, when everything on the internet was true and you didn't have to worry about hoaxes or fake news stories.

No, it isn't (1)

Groo Wanderer (180806) | more than 4 years ago | (#29695117)

So, they are stopping development of AMD chipsets, and stopping development of the Intel chipsets, leaving.... what again?

And their triumphant "no we are not leaving" statement amounts to, "We are going to sell the ones we have designed." Great. As long as Intel makes FSB chips, they can continue to trickle out older chipsets. But no new ones. And they aren't leaving. And there are no American tanks in Baghdad.

Come on, the only reason they are countering this is because the financial community is noticing, and that might downgrade the stock. Even an extremely slow monkey can read what they are saying.

                    -Charlie

Re:No, it isn't (1)

Sycon (1622433) | more than 4 years ago | (#29695947)

That leaves ION for the moment. As for stopping development of AMD chipsets, mostly we just have to wait and see what they announce. They already have chipsets covering all released AMD models, so it's not exactly damaging for them to simply keep selling what they have until they can move their Intel chipsets forward alongside the AMD market.

Heh heh (2, Interesting)

Groo Wanderer (180806) | more than 4 years ago | (#29696081)

Try plugging an SSD into one of those chipsets and see how far you get. Especially an Intel SLC SSD. Then go look for a patch on Nvidia's site.

Intel had to patch around Nvidia's bugs, Nvidia wouldn't. There is a long list of these things.

They may exist, but I wouldn't call them good. They essentially haven't been touched, just renamed, and are seriously showing their age.

                -Charlie

Re:Heh heh (1)

Sycon (1622433) | more than 4 years ago | (#29696279)

And I'm not saying they're good right now, but why in the world would Nvidia bother going to court over this if they have no intention of continuing chipset development? The point is that these rumors are simply based on a different interpretation of the same public information everyone has. And so far there is no reason to believe they are leaving the chipset industry for good.

Re:This is False (1)

ElectricTurtle (1171201) | more than 4 years ago | (#29695171)

If you always believe what a company says about itself, I have some bridges that are just coming on the market that might interest you as an excellent, ground-floor, turnkey investment opportunity.

Re:This is False (1)

Sycon (1622433) | more than 4 years ago | (#29695881)

And you always believe every rumor you hear, don't you?

Re:This is False (0)

Anonymous Coward | more than 4 years ago | (#29695357)

It really doesn't seem to be false. Wall Street thinks it's true:

http://www.fool.com/investing/general/2009/10/09/nvidia-cant-handle-the-heat-gets-out-of-the-kitche.aspx

Re:This is False (1)

Dunbal (464142) | more than 4 years ago | (#29695635)

Wall Street thinks it's true:

Wall Street lives in its own inertial reference frame, and is only vaguely connected to reality.

PS: Keep buying those stocks, I need buyers for my shorts.

Re:This is False (2, Interesting)

default luser (529332) | more than 4 years ago | (#29695383)

That's great. Nvidia is outselling ATI chipsets by dumping stock of their nForce4 (that is what the MCP61 is; you'd know these things if you read the PCPer article linked in the summary), a chipset from 2006 that doesn't even support PCIe 2.0. If that's not a sign of things to come, I don't know what is.

And Nvidia is developing ONE new chipset: ION 2, for Apple. Since the rest of the world is moving on to mobile i7/i5/i3, and even Atom is getting on-die graphics, I can't foresee Nvidia really investing anything in future chipset tech.

Re:This is False (2, Interesting)

TJamieson (218336) | more than 4 years ago | (#29695637)

Hmm.. I've got an MCP61-based AMD system, and it also has PCI-Express 2.0. YMMV?

Re:This is False (0)

Anonymous Coward | more than 4 years ago | (#29695415)

I don't know; that statement didn't really do anything for me. I guess it is just not worded strongly enough. Instead of using lame marketing words like "innovate", why don't they just come out and say the following:

"NVIDIA will continue to develop and produce new chipsets for both the Intel and AMD platforms for the foreseeable future."

Short, simple, and to the point.

Instead, they have this long-winded non-statement that doesn't really do anything to dispel the rumor. The PC Perspective column dissects the statement and comes to the conclusion that it doesn't mean anything; I agree with their assessment.

Re:This is False (1)

cjHopman (810457) | more than 4 years ago | (#29695477)

It's a good thing that you RTFA. That is how you knew that the second one was a detailed analysis of the exact statement from Nvidia that you linked to, showing how it did not really refute the point.

Re:This is False (0)

Bigjeff5 (1143585) | more than 4 years ago | (#29695623)

Er, in that article they explicitly state they are not producing any chipsets for future CPUs due to legal issues:

But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs.

That is, in a nutshell, exactly what the summary says - that they will continue to sell their chipsets for the older FSB processors only.

Now, they are apparently suing Intel on that score, and if they win they will probably jump back in the chipset market, but Nvidia themselves have said in the article you referenced yourself that they are currently not going to produce any chipsets for new processors. They will only be selling chipsets for processors they have already developed.

I'm assuming you didn't score too well in reading comprehension back in school.

Re:This is False (1)

Sycon (1622433) | more than 4 years ago | (#29695861)

I'm assuming you didn't score too well in reading comprehension back in school.

Funny, I was going to say the same about you:

So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.

Also:

Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring with them some amazing breakthroughs that will surprise the industry, just as GeForce 9400M and ION have shaken up the industry this year.

We expect our MCP business for both Intel and AMD to be strong well into the future.

Perhaps you should try actually reading the article before posting a response.

Re:This is False (1)

Hurricane78 (562437) | more than 4 years ago | (#29696027)

Or did they? I know from experience that companies state all the time that they have no intention of doing something *ever*... until the day they actually do what they had long planned and just wanted to keep secret.

Of course that makes such a company look like completely untrustworthy idiots. But hey, managers are managers for a reason (= huge ego; everything that makes them look bad "does not exist"). ^^

Re:This is False (1)

Sycon (1622433) | more than 4 years ago | (#29696117)

It's true, but rumors about companies get publicized every day, and most of them simply aren't true. The point is that right now there's no real evidence that Nvidia is planning a long-term halt to chipset development, so all we can go on is what the company releases.

Intel failing to learn lessons from IBM (1)

iamacat (583406) | more than 4 years ago | (#29694887)

x86 would have gone nowhere if only IBM could make PCs; only an open OEM market achieved dominance over competitors like Apple and Commodore. If Intel won't let other people release chipsets/motherboards for its own processors while AMD is a free-for-all, any technical advantages of Core/Xeon may not be enough to stop market share slowly eroding in favor of the more open product.

Not quite right... From Ken Brown at Nvidia... (5, Informative)

Anonymous Coward | more than 4 years ago | (#29694969)

Reported at HardOCP... http://www.hardocp.com/news/2009/10/08/nvidia_statement_on_chipset_business

NVIDIA's Ken Brown wanted to give us NVIDIA's thoughts on the current state of its chipset business. So here it is in its full text.

Hi,

We've received a number of inquiries recently about NVIDIA's chipset (MCP) business. We'd like to set the record straight on current and future NVIDIA chipset activity.

On Intel platforms, the NVIDIA GeForce 9400M/ION brands have enjoyed significant sales, as well as critical success. Customers including Apple, Dell, HP, Lenovo, Samsung, Acer, ASUS and others are continuing to incorporate GeForce 9400M and ION products in their current designs. There are many customers that have plans to use ION or GeForce 9400M chipsets for upcoming designs, as well.

On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel.

We will continue to innovate integrated solutions for Intel’s FSB architecture. We firmly believe that this market has a long healthy life ahead. But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.

Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring with them some amazing breakthroughs that will surprise the industry, just as GeForce 9400M and ION have shaken up the industry this year.

We expect our MCP business for both Intel and AMD to be strong well into the future.

Let me know if you have any questions, and thanks for your interest.

Best,

Ken

Old news (1, Interesting)

Groo Wanderer (180806) | more than 4 years ago | (#29695061)

This isn't new, they knifed it a year+ ago. I wrote it up then:
http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history [theinquirer.net]
and no one believed it. Now that NV has no choice but to admit it, they stopped pretending. Yay?

They are doing the same thing about their "not killing" the GTX285/275/260, it is just a temporary shortage or some twaddle. This one won't take a year to admit though.

                      -Charlie

Re:Old news (1)

default luser (529332) | more than 4 years ago | (#29695189)

Charlie - you claim that Nvidia will be dropping their midrange graphics chips, but offer no explanation why. While I tend to agree with your insight, I can't see why Nvidia would be willing to give up market share just to stanch the bleeding a little. I mean, what the hell else does Nvidia make money off of, aside from midrange graphics (Tegra? Too early to tell. Chipsets? They're gone. HPC? Small market.)? It would be foolish to let their one remaining profitable enterprise languish.

But I have to believe you're right, because Nvidia shows none of the normal signs of competing. Normally when ATI releases something better and Nvidia wants to compete, they introduce some impressive price drops, and we're not seeing that this time around. Is Nvidia really going to do something as stupid as sacrificing market share just to save a few dollars?

Re:Old news (0)

Anonymous Coward | more than 4 years ago | (#29695309)

It's quite simple. Nvidia has graphics chips that cost $x to create. If they can't sell them for more than that due to competition with another company, then what is the point of creating said chips? Building something and selling it at a loss is a losing strategy. It's better to regroup and try again with something that makes you money.

What Nvidia needs are cheap, powerful chips. What they have are expensive, not-so-powerful chips that soon no one will want to buy. It's quite simple: their engineering dept. needs completely new leadership, or they will soon be an ex-company.

Re:Old news (1)

default luser (529332) | more than 4 years ago | (#29695797)


It's quite simple. Nvidia has graphics chips that cost $x to create. If they can't sell them for more than that due to competition with another company, then what is the point of creating said chips? Building something and selling it at a loss is a losing strategy. It's better to regroup and try again with something that makes you money.

What Nvidia needs are cheap, powerful chips. What they have are expensive, not-so-powerful chips that soon no one will want to buy. It's quite simple: their engineering dept. needs completely new leadership, or they will soon be an ex-company.

I guess you're right. GT200b is a lot less expensive to make than GT200, but the build cost of the board has never come down. The 512-bit bus means the PCBs still require more layers, and board makers must include 16 memory chips in every build (a 512-bit bus across 32-bit GDDR3 devices works out to 16 chips).

Since the high-end PC gaming market has partially stalled, there's no demand for a 2GB single-GPU card at the top end, so Nvidia can't even take advantage of increasing GDDR3 memory densities. So those GTX285 boards are still shipping with paltry 512Mbit chips.

Since you asked (1, Troll)

Groo Wanderer (180806) | more than 4 years ago | (#29695683)

I was trying not to pimp my own stuff, but since you asked.....

Short story #1: the G200b-based cards are huge and need expensive PCBs. They cost more to make than the upcoming and likely faster ATI Juniper parts, so NV will have to wrap a $20 bill around each card to make them sell. Not a good long-term business plan. I can't say more because I was prebriefed on the ATI cards and agreed not to talk about them. When you read this, keep in mind that I gave Nvidia a very generous benefit of the doubt. You will understand why a lot better next week or so.
http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/ [semiaccurate.com]

Short story #2, a short while after I finished the above story, I got a call detailing how the GTX260/275/285 and possibly 295 were being killed. I wrote it up here:
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/ [semiaccurate.com]

If you go back and look, the Nvidia denials and attacks against me are personal and do not address the facts, just attack the messenger. Kyle posted one from Ken Brown at Nvidia here:
http://www.hardocp.com/news/2009/10/07/nvidia_abandons_market6363636363/ [hardocp.com]
Note HOW they say it, and what they do NOT say. They did the EXACT same thing a year ago when they were denying the chipset knifings. You could almost take this as desperate spinning because their pants are so firmly around their ankles that they can't run, and they can't refute the facts because I am right.

Then again, what do I know.

            -Charlie

Note: Cue the Nvidia fanbois in 3.... 2.... 1.....

Re:Since you asked (2, Informative)

default luser (529332) | more than 4 years ago | (#29696051)

Thanks Charlie. You don't have to say any more about ATI, because the cat's already out of the bag (some site broke the Tuesday NDA). They'll be moving exclusively to GDDR5 on a 128-bit bus for their midrange parts. This means that right now they could sell a cheap 512MB 5850 with 4 memory chips for next to nothing. And once 2Gbit GDDR5 parts ship next year, those 1GB 5770 parts can be paired with just 4 memory devices, and could probably be sold for the same cheapo $100.

The power of a 4890 (almost) for around $100 six months from now? It's certainly possible; it's just amazing what GDDR5 brings to the table!

Sounds to me like the venerable GDDR3 is finally headed for that big tech dump in the sky. It only took five years!

Re:Old news (2, Interesting)

Zoson (300530) | more than 4 years ago | (#29695617)

And then you were promptly fired for writing FUD.

Nobody believes a word you say. You lost all credibility long ago.

It's just a shame the Inquirer has not removed your negative, blatantly biased garbage.

Re:Old news (0)

Anonymous Coward | more than 4 years ago | (#29696001)

Charlie,

Just curious: I've been following these stories from you for a while, and I've seen a number of them come true as the things you wrote were proven out. Right now the FUD is that you got fired by the Inq for being discredited on all of this Nvidia stuff. Can you set the record straight about what the whole deal is, please?

I understand the Inq doesn't allow signing NDAs, so how is it that you supposedly broke one and that was the source of the whole Nvidia thing? For whatever it's worth, I consider it to speak volumes that you haven't been sued for libel. I also bought an ATI graphics card over Nvidia based on what you have written, after following the story for quite a while.

Re:Old news (0)

Anonymous Coward | more than 4 years ago | (#29696105)

TheInquirer are a bunch of rumour-mongering asshats.

Um... the facts for this were obtained where? (1, Informative)

Anonymous Coward | more than 4 years ago | (#29695073)

NVIDIA has clearly stated that this is not the case in a press release as recently as... yesterday:

http://www.tomshardware.com/news/nvidia-gpu-graphics-chipset,8821.html#xtor=RSS-181

Let me be the first to say.... (1)

fataugie (89032) | more than 4 years ago | (#29695331)

Fuckin' A.

I never thought it would come to this and I'm sorry to see them go.

Erm... Hello? Apple? (2, Insightful)

magus_melchior (262681) | more than 4 years ago | (#29695337)

They have a huge contract with Apple, who have adopted NVidia chipsets for pretty much the entire Mac product line. Given that Jobs would have preemptively shifted to another chipset platform in the last round of announcements if this were even remotely true, I seriously doubt that NVidia would even think of limiting further chipset R&D to Ion 2.

Unfortunately, I'm used to the editors slipping at least twice a day...

Re:Erm... Hello? Apple? (1)

Bigjeff5 (1143585) | more than 4 years ago | (#29696323)

You know Apple switched to the x86 architecture a while ago and uses Intel processors exclusively, right?

If Nvidia can't produce chipsets for the new Intel processors, that deal is only going to last as long as FSB parts remain marketable. As soon as DMI is the norm from high end to low end, Nvidia won't be selling chipsets to anybody.

Sure, it will be a while, but that deal was doomed as soon as it was written; it is not a long-term contract.

Maybe we are moving away from Intel .. (1)

savuporo (658486) | more than 4 years ago | (#29695489)

Maybe this is a sign that NVIDIA is moving more towards ARM, which has always been a system-on-a-chip architecture. The Tegra lineup is a very nice product already, and with ARM going Cortex-A9 and multicore this year, maybe Nvidia just has a more important space to play in than tinkering around with x86 chipsets?

This is FUD and should be removed. (4, Informative)

Zoson (300530) | more than 4 years ago | (#29695545)

nVidia has published an official response.
http://hardocp.com/news/2009/10/08/nvidia_statement_on_chipset_business

Finish the drivers, Intel and AMD (3, Insightful)

Sloppy (14984) | more than 4 years ago | (#29695579)

It looks like, long-term, Intel and AMD/ATI are going to be the only games in town. That wouldn't worry me a whole lot, because I think their stuff looks good on paper, and they'll compete. And both of them are slowly advancing their open source drivers. But the key word is "slowly." If, say, you want to buy a machine to use as a MythTV box or something like that, NVidia is currently the only one it makes sense to buy. Anybody else, and you're going to have to decode your video on the CPU and read promises about how some day you might not have to. I hate reading promises.

I am not looking forward to the day when these two windows of acceptability don't overlap. What happens when you want to build a box and neither Nvidia nor Intel nor AMD has a product that can actually be used, either because they're gone (Nvidia) or because their drivers aren't working yet (Intel and AMD)? That is going to suck.

Re:Finish the drivers, Intel and AMD (2, Insightful)

linhares (1241614) | more than 4 years ago | (#29696057)

I hate reading promises.

Well then the Nobel committee won't have you.

Re:Finish the drivers, Intel and AMD (1)

Late Adopter (1492849) | more than 4 years ago | (#29696349)

I think you're confused. nVidia isn't leaving the graphics card business. Just the mainboard chipset market (allegedly). I suppose this will mean fewer integrated video solutions based on nVidia, but you'll always be able to go buy a discrete PCI Express 2.0 card for your MythTV box. And on top of that, Intel has really good open drivers for their mainboard chipsets, so the combination of the two could actually make good sense for your situation.

So Intel killed Nvidia? (1)

Snaller (147050) | more than 4 years ago | (#29695655)

Really? That can't be good.

sli? (1)

t8z5h3 (1241142) | more than 4 years ago | (#29696123)

What about SLI? Is that dead too? And what is the status of Intel chipsets with SLI support? Is this a sign that Crossfire has won?

Damn! Damn! Damn! (1)

drbuzz0 (1638167) | more than 4 years ago | (#29696579)

At the risk of sounding like a fanboy, I LOVE nVidia hardware and have always opted for nVidia chipsets whenever possible. I've had excellent experiences with them, especially when it comes to drivers and support. They just plain work. I've never had a driver conflict or any other bug. They work the first time, every time. No compatibility issues, no bugs, no need for patches, etc. Always smooth as silk.

That's just my own personal experience and I'm sure someone out there will dispute it, but for me, consistent positive experiences like that are something no advertising can buy. I've had enough experience with products from Intel, AMD, ATI (before and after merging with AMD), Matrox, Creative and... well, you name it. They generally work fairly well, but none have been as headache free as nVidia.

Good riddance (1, Funny)

Anonymous Coward | more than 4 years ago | (#29696607)

Just like when VIA announced they'd stop making chipsets:

Good riddance, nVidia. Take your bowl of broken, buggy chipsets with you on the way out.

Next up: SiS.

I'd have mentioned ServerWorks, but they got bought by Broadcom. Oh, in that case:

Next up: Broadcom.
