
Intel To Pay NVIDIA Licensing Fees of $1.5 Billion

Soulskill posted more than 3 years ago | from the not-exactly-pocket-change dept.

wiredmikey writes "NVIDIA and Intel have agreed to drop all outstanding legal disputes between them and Intel will pay NVIDIA an aggregate of $1.5 billion in licensing fees payable in five annual installments, beginning Jan. 18, 2011. Under the new agreement, Intel will have continued access to NVIDIA's full range of patents."

135 comments

Wonder if Intel.. (4, Interesting)

RightSaidFred99 (874576) | more than 3 years ago | (#34829684)

Wonder if Intel will be able to use any of NVidia's patents to bolster their GPUs, which is really their only sore spot at the moment (Atom vs. ARM might be a sore spot, but there's hope there).

Re:Wonder if Intel.. (1)

icebike (68054) | more than 3 years ago | (#34829780)

Wonder if Intel will be able to use any of NVidia's patents to bolster their GPUs, which is really their only sore spot at the moment (Atom vs. ARM might be a sore spot, but there's hope there).

I rather suspect Intel was using Nvidia's patents all along, and that was what the big fight was all about.

I doubt they will stop now, since they are now paying for the privilege. I also suspect Nvidia got something besides money in return, such as access to certain Intel patents or something.

Re:Wonder if Intel.. (2)

icebike (68054) | more than 3 years ago | (#34829820)

I note that AMD CEO Meyer resigned. [mercurynews.com]

Perhaps this agreement was the writing on the wall for him?

Re:Wonder if Intel.. (2)

alvinrod (889928) | more than 3 years ago | (#34830068)

It could be coincidence, or it could be incredibly telling.

The best thing AMD had going was that Intel's onboard GPUs sucked. AMD has a new chip architecture coming out in the next few months and no one really knows how well it performs, except AMD. It was pretty much a given that they would have a better integrated GPU since they have ATI building it, but the CPU portion of the chip is still an unknown.

We'll assume, for the sake of argument, that AMD knows it will be weaker on the CPU side but stronger on the GPU side. If Intel can grab as many of nVidia's goodies as they'd like, the GPU side of the equation will probably cancel out, leaving AMD without a pot to piss in.

Or, it could just be coincidence.

Re:Wonder if Intel.. (2)

cheater512 (783349) | more than 3 years ago | (#34830294)

AMD is cheaper, though, and when you're squabbling over a minor speed difference across 4 cores just to read your email, it's pretty irrelevant.
Might be relevant for workhorse scenarios, but nothing else.

Re:Wonder if Intel.. (1)

alvinrod (889928) | more than 3 years ago | (#34830440)

They're not cheaper because they want to be; they're cheaper because they have to be. They can't compete on performance, so they try to do it on price. Look at the financial results from the last few quarters for both companies: one made landmark profits while the other suffered a loss.

I hope AMD can stay in the game, if for no other reason than that Intel got incredibly lazy with NetBurst and if AMD hadn't stepped up, Intel probably wouldn't have cared all that much.

Re:Wonder if Intel.. (2)

Rockoon (1252108) | more than 3 years ago | (#34832556)

They're not cheaper because they want to be, they're cheaper because they have to be. They can't compete on performance

Intel is more expensive because they have to be. They can't compete on value.

See what I did there?

The only metric worth anything is performance per dollar. You haven't used that metric, but you tried to draw a conclusion as if you had.

Before you reply in fanboy rage, let's try it with cars:

Ford is not cheaper (than Ferrari) because they want to be; they're cheaper because they have to be. They can't compete on performance, so they try to do it on price.

With cars your bullshit logic has no teeth. Now why the fuck would it apply to processors but not cars?

Intel has a very large range of processor models at a very large range of price points. Yet they don't top the charts with what's on the market today. [cpubenchmark.net]

Intel charges more because it has brand recognition which was propped up with the illegal activities of this convicted monopolist.
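The "performance per dollar" metric above is just a ratio; here's a toy sketch using hypothetical numbers (the chip names, scores, and prices are made up for illustration, not real benchmark data):

```python
# Hypothetical benchmark scores and prices -- illustration only, not real data.
chips = {
    "Chip A": {"score": 9000, "price": 999.0},  # faster, much more expensive
    "Chip B": {"score": 7500, "price": 250.0},  # slower, far cheaper
}

for name, c in chips.items():
    points_per_dollar = c["score"] / c["price"]
    print(f"{name}: {points_per_dollar:.1f} points/$")
```

On this metric the slower, cheaper chip wins handily, which is exactly the argument being made: ranking by raw performance alone tells you nothing about value.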

Re:Wonder if Intel.. (1)

interkin3tic (1469267) | more than 3 years ago | (#34831450)

When you are squabbling about a minor speed decrease, over 4 cores to read your email, it's pretty irrelevant.

Not true, some of my family members have bloated their e-mails to keep pace with Moore's law.

"Wow, my new computer can handle 200 animated gifs of kitties at a time!!! [loads up outlook]"

Re:Wonder if Intel.. (1)

gfody (514448) | more than 3 years ago | (#34831244)

Integrated GPUs suck in general and will continue to suck for some time. Let's assume for the sake of argument that AMD and Intel both understand that x86 performance matters little and the future is power efficient SOCs with decent graphics performance. Let's also have a look at what they've done recently:
AMD - created an atom-like chip with decent graphics performance
Intel - created an even more powerful x86 chip with OK graphics performance
NVidia - announced an ARM chip together with MS announcing windows support for ARM

So long Wintel, Hello Winvidia! I think it's clear that AMD and Intel are crapping their pants and swiftly ousted AMD's CEO to prepare for the merger of AMD and Intel.

powerful enough! (0)

Anonymous Coward | more than 3 years ago | (#34831490)

not everybody plays crysis, you insensitive clod!

Re:Wonder if Intel.. (1)

Moryath (553296) | more than 3 years ago | (#34832090)

So long Wintel, Hello Winvidia! I think it's clear that AMD and Intel are crapping their pants and swiftly ousted AMD's CEO to prepare for the merger of AMD and Intel.

Merger of AMD and Intel? Not likely. The phrase "antitrust lawsuit" comes swiftly to mind.

More likely, what you'll get from ARM-based Windows is an attempt to compete in the tablet/smartphone market. Let's face it, Win7phone is a joke, Android OS was never designed for tablets and Google has admitted so, leaving the need for "something" to compete with iPads down the road.

Dell did a pretty good job with the Inspiron Duo, but Atom-based will only go so far.

Re:Wonder if Intel.. (1)

hairyfeet (841228) | more than 3 years ago | (#34832632)

Sorry, I can't remember where I read it, but out of all the theories about Windows on ARM, I read exactly ONE that made sense: .NET. The poster wrote that Windows ARM will of course not be x86 Windows but will be based around .NET, so that one can compile an app for Windows ARM and it'll run just fine on x86 Windows as well. This would allow developers to "write once, run twice" without having to rewrite. That made a hell of a lot more sense to me than trying to bolt some sort of x86 backwards compatibility onto ARM, or trying to get those millions of Windows apps ported.

As for TFA, I know the DOJ here in America is sucking corporate cock so hard it makes a Hoover vacuum look gentle, but why hasn't the EU or somebody done shit about Intel? Winning the market fair and square? Yippee skippy, I'm happy for you. But what Intel is doing is basically using every dirty trick in the book and then throwing some money at the competitor (without changing their ways ONE bit) when they get caught.

First they bribed the OEMs and rigged their compilers against AMD, which caused serious long-term damage to AMD (one could easily argue the OEMs wouldn't have taken the shitty pig NetBurst over the much better AMD chips if it weren't for bribes, making AMD's financial situation MUCH worse). Then, when AMD looked to be on the ropes but ready to drag them to court, they threw some money at them AND got to keep using AMD patents. Then they screwed Nvidia out of being able to make chipsets for the new chips, killing their chipset business dead, and now they throw them some money to fuck off AND get access to their patents. See a pattern here?

Time and again Intel is getting rewarded for rigging the market and making sure they have no real competition. This would be like allowing MSFT and Apple to bribe handset OEMs not to use anything Linux-based, and then when they get caught they throw some money at the Linux Foundation AND get a free pass to use any Linux patents. The free market simply doesn't work if we allow the 800-pound gorillas to bribe and rig the system in their favor; just look at the damage MSFT was able to do with IE thanks to their '90s dirty tricks. If someone doesn't do something about Intel we're gonna end up with a single supplier of CPUs, and we can go back to the lovely days when a PC cost a minimum of a couple of grand. Is that REALLY what we want to see happen?

After the bribery and compiler rigging came out I quit buying, building, or selling Intel, and those who care about the free market should do the same. Unless you're in one of the tiny niches where money is no object and every drop of speed is critical, AMD chips are both cheap and well-performing, and their IGPs and discrete GPUs stomp the hell out of Intel's. If they won't do a damned thing to stop it, then I urge my fellow geeks to stand up. WE are the ones who make the buying choices for our families, coworkers, and places of business, and WE can say in a single voice, "Rigging the market is NOT allowed!" Even in the workstation market one can buy an AMD server board with dual 8-core CPUs for less than the price of a single top-of-the-line Intel chip. So make yourselves counted; push AMD where you can. This rigging and bullying of the market simply can't be allowed to continue.

Re:Wonder if Intel.. (1)

ogdenk (712300) | more than 3 years ago | (#34832660)

Integrated GPUs suck in general and will continue to suck for some time.

A few years ago I'd agree, but the GeForce 9400M in my 2009 MacBook is certainly a more capable GPU than most users ever utilize. The newer incarnations of it certainly don't suck for most apps, and they're well supported for OpenCL apps, under Mac OS X at least.

If you want to play the latest and greatest at 1600x1200 at max settings and 4x antialiasing it ain't gonna cut it but for light gaming and ACTUAL WORK they do just fine.

Now Intel's integrated GPUs suck. So do most of the ATI offerings, but they have a few that are adequate. Not sure about OpenCL support on the ATI integrated GPUs, however.

Re:Wonder if Intel.. (0)

Anonymous Coward | more than 3 years ago | (#34830280)

From the article:

Intel recognized an expense of $100 million in the fourth-quarter of 2010, classified as "marketing, general and administrative."

Maybe a piece of that cake?

Re:Wonder if Intel.. (0)

Anonymous Coward | more than 3 years ago | (#34829970)

Intel CPU & Nvidia GPU linked by QPI.

Nvidia goes fabless like AMD but using Intel facilities.

Intel announcing Atom & Nvidia shared cores.

Re:Wonder if Intel.. (1)

Locke2005 (849178) | more than 3 years ago | (#34830214)

If Intel was already using nVidia's patents, then why did their graphics chips suck so badly? You'd think if they were stealing nVidia ideas, they could come up with better graphics, wouldn't you?

Re:Wonder if Intel.. (1)

icebike (68054) | more than 3 years ago | (#34830308)

Follow the money.

If they weren't already using their patents, do you think they would be paying 1.5 Billion?

Re:Wonder if Intel.. (1)

eggnoglatte (1047660) | more than 3 years ago | (#34831482)

Because Intel just doesn't take drivers seriously. Unlike, say, audio or networking hardware, the drivers for GPUs implement a lot of the actual API (OpenGL or Direct 3D). Drivers are a crucial component in the GPU system design, and Intel just never got that.

That is one of the (several) reasons why NVIDIA won't open source their drivers. They actually do have significant trade secrets in that domain.

Re:Wonder if Intel.. (0)

Anonymous Coward | more than 3 years ago | (#34831856)

Try the memory department: RAM striping/channeling, and possibly the memory controller integrated into the CPU. HyperTransport.

Just think outside video cards.

Re:Wonder if Intel.. (1)

Renraku (518261) | more than 3 years ago | (#34830098)

If Intel needs anything, it's to work on their integrated graphics for laptops. Intel graphics chipsets are so bad that they struggle immensely with low resolution flash animations.

Re:Wonder if Intel.. (1)

smash (1351) | more than 3 years ago | (#34830192)

No, they don't. They're fine for business-grade laptops performing business-grade activity. I've had several of them. An OPTION for a high-end part would be nice, but on my ultraportable 12.1" work machine I do not miss having an nvidia GPU at all.

Re:Wonder if Intel.. (1)

KiwiSurfer (309836) | more than 3 years ago | (#34830198)

I have an Intel GPU on my desktop at home and it works fine with Flash animations. It does have difficulties with recent games, I concede that, but Flash animations aren't too much of a challenge for Intel's GPUs.

Re:Wonder if Intel.. (1)

postmortem (906676) | more than 3 years ago | (#34830856)

Try running Flash videos at 720p or 1080p. That's where AMD and nvidia hardware acceleration shines.

Re:Wonder if Intel.. (1)

Nutria (679911) | more than 3 years ago | (#34832100)

But isn't GPU-assisted-Flash only about 6 months old?

Re:Wonder if Intel.. (1)

KiwiSurfer (309836) | more than 3 years ago | (#34832456)

Which I have no problem doing on my hardware. Seriously, take a look at the latest (read: 2-year-old) stuff coming out of Intel and see how well it performs with the common tasks required by 95% of users. Your example, HD YouTube video, works pretty well on this 2-year-old machine using integrated Intel graphics. I had an nVidia PCI-Express card (with some 1GB of DDR RAM) for a while, and I didn't notice much difference when I switched back to Intel graphics -- the only notable difference was in games released in the last few years or so.

please note the phrase (0)

Anonymous Coward | more than 3 years ago | (#34832232)

"integrated graphics FOR LAPTOPS"

does your desktop have a "laptop" chipset?

Re:Wonder if Intel.. (0)

Anonymous Coward | more than 3 years ago | (#34830624)

If Intel needs anything, it's to work on their integrated graphics for laptops.

because you are intimately familiar with mobile SB graphics performance?

Re:Wonder if Intel.. (1)

Nursie (632944) | more than 3 years ago | (#34832216)

Use patents to bolster their GPUs?

Hahahaha... you crack me up. It's almost as if you think patents are a publication of technological ideas and techniques, designed to exchange a limited monopoly for full disclosure!

Poor fool, patents are a mechanism whereby an intentionally vague and generic description of something can be used for a legal land-grab and territorial pissing contest!

patents (0)

Anonymous Coward | more than 3 years ago | (#34829716)

and intel's employees will feel the hurt, while nvidia's employees won't feel a thing

Re:patents (-1)

Anonymous Coward | more than 3 years ago | (#34829916)

The first time I fucked your mom in the ass she felt the hurt, while now that I've done it so many times she doesn't feel a thing.

look what's left of DEC-Alpha employees... (2)

garyisabusyguy (732330) | more than 3 years ago | (#34830998)

after Compaq sold the rights to Intel

You want to assume that some of them are still working Alpha goodness into Intel products, but it is just as likely that Intel killed the tech and kept the talent out of the light of day.

Re:look what's left of DEC-Alpha employees... (1)

gstrickler (920733) | more than 3 years ago | (#34831460)

My understanding is that a bunch of them "defected" to AMD shortly after Intel acquired the DEC/Alpha technology, but I don't have any confirmation of that.

Given Intel's impending divorce with Microsoft.... (1, Insightful)

tloh (451585) | more than 3 years ago | (#34829720)

umm...I for one welcome our new GeF-tel overlords?

I know, I know - but who cares if Microsoft != NVIDIA.

Still no x86 license. (1)

Lashat (1041424) | more than 3 years ago | (#34829778)

It looks like NVIDIA really is betting the company on ARM. Godspeed.

Re:Still no x86 license. (0)

Anonymous Coward | more than 3 years ago | (#34829804)

IANAL; someone enlighten me on when the x86 patents run out (am I even in the right ballpark, legally)? I mean, x86 has been around... forever.

Re:Still no x86 license. (1)

GigaplexNZ (1233886) | more than 3 years ago | (#34829986)

x86 itself has been around for a while, but recent developments such as x86-64 and other various instruction set extensions still have a long way to go before coming remotely close to expiring.

Re:Still no x86 license. (0)

Anonymous Coward | more than 3 years ago | (#34830122)

x86-64 is owned by AMD. Somehow I doubt they'll be licensing it to their main GPU competitor.

Re:Still no x86 license. (1)

sexconker (1179573) | more than 3 years ago | (#34830418)

x86-64 is owned by AMD. Somehow I doubt they'll be licensing it to their main GPU competitor.

Somehow I doubt they'd be legally allowed to NOT license it to their main GPU competitor.

Re:Still no x86 license. (1)

bws111 (1216812) | more than 3 years ago | (#34830976)

Of course they don't have to license it to a competitor. The whole purpose of patents is to be able to license (or not) your stuff to whomever you want. Sometimes a monopoly may be found guilty of anti-competitive behavior, and part of the remedy may be forced licensing of patents, but that is not normally the case.

Re:Still no x86 license. (1)

eggnoglatte (1047660) | more than 3 years ago | (#34831558)

It would be hard to argue that AMD has a monopoly, so therefore you are right - it is unlikely they'd be forced.

However, if NVIDIA really needs that tech, they can just start violating the patent. AMD sues. NVIDIA countersues AMD for violation of NVIDIA patents (it is almost guaranteed that AMD violates some of NVIDIA's patents). In the end, they either fight it out in court or reach a settlement. Either way, the resolution of the conflict is that somebody determines the difference in value between the patent violations, and that difference gets paid: either NVIDIA pays AMD, or vice versa. And afterwards, both parties have licenses to each other's patent portfolios. Kinda like what just happened between NVIDIA and Intel.

The only downside of this approach is that it might end up costing NVIDIA a pretty penny, but if they really think they need to use x86-64, this is one way to do so.

Re:Still no x86 license. (1)

Khyber (864651) | more than 3 years ago | (#34831050)

x86-64 is a CPU architecture.

Why the fuck should a GPU company get the rights to a CPU architectural extension?

Besides, I bet they're already working with Intel's stuff - it would explain why Fermi burned like a bitch first-gen; they must've tried implementing NetBurst.

Re:Still no x86 license. (1)

mikael (484) | more than 3 years ago | (#34831732)

Because a GPU caches texture memory in 2D and 3D pyramids (MIP-mapping), and a CPU does code and data page caching in 1D. Somehow, these two just might merge, especially with shader languages and trillion-point data sets.
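The 2D/3D pyramid mentioned here is MIP-mapping: the texture is stored at successively halved resolutions and the hardware picks a level per pixel. A minimal sketch of that level selection (a simplified model with an invented function name; real GPUs derive the footprint from per-pixel screen-space derivatives):

```python
import math

def mip_level(footprint_texels: float, num_levels: int) -> int:
    """Pick the pyramid level whose texel size roughly matches one pixel.

    footprint_texels: how many base-level texels one screen pixel covers
    along an axis (a simplification of the GPU's derivative-based estimate).
    """
    if footprint_texels <= 1.0:
        return 0  # magnification: sample the full-resolution base level
    level = int(math.log2(footprint_texels))
    return min(level, num_levels - 1)  # clamp to the coarsest level

# A pixel covering 4 texels along an axis samples level 2,
# i.e. the quarter-resolution copy of the texture.
```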

Re:Still no x86 license. (1)

Surt (22457) | more than 3 years ago | (#34832242)

x86-64 is a CPU architecture.

Why the fuck should a GPU company get the rights to a CPU architectural extension?

besides, I bet they're already working with Intel's stuff - it would explain why the Fermi burned like a bitch first-gen, they must've tried implementing netburst.

The fuck is because they also used to be a chipset company in addition to a GPU company. That said, a license to x86-64 is not the problem; it's LGA1156 that's the problem.

Re:Still no x86 license. (1)

icebraining (1313345) | more than 3 years ago | (#34830066)

x86 probably already has, but not the subsequent extensions like SSE/3DNOW/amd64. Building a 486 CPU clone probably isn't very profitable.

Re:Still no x86 license. (1)

Locke2005 (849178) | more than 3 years ago | (#34830188)

People are still building 486 chips for embedded use, but it's a low-margin commodity market. You're correct, the big margins are only on the cutting edge chips, currently the multi-core CPUs.

Re:Still no x86 license. (1)

SuricouRaven (1897204) | more than 3 years ago | (#34830112)

What the other two said about extensions. Also, a modern CPU has a huge number of more recent technologies. Some obvious - hyperthreading, SpeedStep - and I'm sure a lot of fine details in just how the pipelining, cache management, branch prediction and such all work. So, although you could make a processor not covered by any current patents, its performance would be so pathetic it'd be of use only in embedded applications.

Re:Still no x86 license. (2)

TheRaven64 (641858) | more than 3 years ago | (#34830180)

There's no such thing as an 'x86 patent'. There are, however, parts of the x86 architecture that are difficult to implement without using certain patented techniques. Most of these come from the later generations and new ones are added with each new release by AMD or Intel. Both have a cross-licensing agreement, so they can use each others' patents, but new entrants into the market have a problem. Cyrix worked around this by having IBM (which also has a cross-licensing deal with Intel) fab their chips.

Re:Still no x86 license. (1)

petermgreen (876956) | more than 3 years ago | (#34830394)

IANAL, someone enlighten me on when X86 patent runs out (am I even in the right ballpark, legally) ? I mean, X86 has been around.. forever.

It depends what you mean by x86. x86 is being constantly extended, and those extensions are almost certainly being patented.

Any patents on the 8086 itself will almost certainly have long expired. Hell, even the 80486 is probably pretty much clear by now (though I can't say for sure, due to the craziness of patent law in some countries, including the US), and IIRC there are a few companies out there making 486 clones for the embedded market (though I can't seem to find any details right now).

BUT, IIRC, Intel and AMD both hold a lot of patents on modern extensions to x86 and on fast ways of implementing x86, and are probably unwilling to license them. This means the only option for a new entrant is to go ahead anyway and hope they have enough retaliatory patents to get a settlement that lets them continue.

Re:Still no x86 license. (1)

postmortem (906676) | more than 3 years ago | (#34830978)

Software implementations of new instruction sets have always trailed by years, simply because you'd have to rewrite already-working, tested code. For what benefit? What if performance was OK with the old instruction set? I suspect making a CPU architecturally equivalent to a Pentium I that runs at 1000MHz is fairly easy if there is no licensing cost and the spec is open. And that's more than good enough to drive Windows XP - which is still current.

Re:Still no x86 license. (1)

Surt (22457) | more than 3 years ago | (#34832228)

It ran out. But what nvidia needs in order to go back to making Intel chipsets is access to the patents on the chip interface (e.g. socket 1156/1155), which are recent (last 5 years).

Re:Still no x86 license. (1)

smash (1351) | more than 3 years ago | (#34830204)

Given that the market for ARM is many times larger than x86 (mobile phones, tablets, desktop phones, embedded systems), that's not such a bad move.

Re:Still no x86 license. (1)

jonwil (467024) | more than 3 years ago | (#34830584)

Why doesn't NVIDIA buy Via? They'd get the x86 license (and given the recent rulings by the FTC that Intel is abusing its market power, I think Intel would be dumb to end Via's x86 license just because Via was acquired by NVIDIA).

NVIDIA could use the acquired CPU tech alongside its ION GPU tech to produce a viable competitor to ATOM in the netbook space.

Re:Still no x86 license. (1)

Lashat (1041424) | more than 3 years ago | (#34830930)

NVIDIA does not have the money to buy VIA. VIA is HUGE in China.

Re:Still no x86 license. (2)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#34830950)

I can only imagine that NVIDIA has concluded that, without any fab abilities of their own, playing #3 in the x86 market would be a cruel, low-margin game (assuming they even managed to make a profit at it). You already have Intel, whose GPUs are anemic but who has the best core designs and superb in-house fabs. Then you have AMD, whose cores and (formerly in-house) fab capabilities lag Intel's, but whose GPUs are approximately equal, on average, to NVIDIA's, and whose CPUs kick the hell out of Via's.

There just isn't a pleasant niche to be had there: among customers who don't care about GPU performance, Intel can afford to practically give away its low-end x86s because of superior fab prowess. Among customers who do care about GPU performance, AMD has ATI GPUs of varying power coupled with CPUs that don't suck.

Re:Still no x86 license. (1)

mikael (484) | more than 3 years ago | (#34831822)

Nvidia has probably deliberately chosen not to get involved in fab production - that's not really their core business (chip design is, not chip manufacture). It's safer for them to lease capacity from multiple fab plants, just in case any one gets disrupted for whatever reason. Leave the boards to the board manufacturers and just give them a reference design.

Very Interesting Vis-a-vis AMD/ATI Aquisition (4, Interesting)

divide overflow (599608) | more than 3 years ago | (#34829814)

I can't help but wonder if this was primarily a fig leaf for Intel's licensing/acquisition of NVIDIA's GPU technology with which to compete with AMD and its acquisition and incorporation of ATI's graphics products within its own silicon. This may have advantages over the alternative of Intel making an offer to purchase all of NVIDIA.

No x86 or Chipset. (3, Insightful)

rahvin112 (446269) | more than 3 years ago | (#34829838)

Looks like nvidia finally gave up on getting the x86 or chipset license. Guess the CEO is now going to bet the farm on ARM and Linux and thinks they can pull it off with closed-source drivers! Either that or ARM Windows, which in my opinion will be DOA. Those patents were nVidia's best hope for an x86 license; Intel appears to have bargained with the bottom line being no x86.

Re:No x86 or Chipset. (2)

Junta (36770) | more than 3 years ago | (#34829864)

ARM windows which in my opinion will be DOA

But look at all the success they have had with the Windows editions for MIPS, Alpha, Itanium, and Power! (No, I don't count the kernel on the Xbox 360 in the same realm here.)

ARM and Linux? (1)

nurb432 (527695) | more than 3 years ago | (#34830142)

Don't forget windows is now running on ARM..

Re:ARM and Linux? (1)

Tetsujin (103070) | more than 3 years ago | (#34830506)

Don't forget windows is now running on ARM..

I read this and picture Popeye, powered up from a fresh dose of canned spinach, suddenly flexing and (for reasons that only begin to make sense in terms of his current situation) a picture of an anthropomorphic window-pane with arms and legs appears on his bicep. And the window is running...

Re:No x86 or Chipset. (1)

Kjella (173770) | more than 3 years ago | (#34830224)

What good would an x86 chipset license do them? No matter what, Intel has moved the GPU onto the die, meaning the only thing nVidia could do is add cost by adding another mediocre GPU. Intel may or may not have enough legitimate grounds, in that AMD is doing the same, to win an anti-trust suit, but either way the best nVidia could hope for was cash, not their market back. nVidia knows that long term it needs to find another niche, as Intel's graphics suck less each generation and graphics cards are approaching overkill.

Re:No x86 or Chipset. (0)

Anonymous Coward | more than 3 years ago | (#34830634)

graphics cards are approaching overkill.

And CPUs haven't?!

The overkill is on the high-end. There's still room to improve on the low-end (power consumption). Offer people something better/cheaper than ION (Atom+9400) and there will be buyers.

Re:No x86 or Chipset. (0)

Anonymous Coward | more than 3 years ago | (#34830806)

The overkill isn't there for laptops, where you have to spend $1,500 minimum for a cheaply built high-spec gaming notebook - and that's probably with a 1.7GHz quad. Hardly ideal. Sandy Bridge is a huge improvement for mobile, but still less than is desired.

Re:No x86 or Chipset. (1)

rahvin112 (446269) | more than 3 years ago | (#34830836)

A year or two ago the chipset business was nearly 1/3 of nVidia's business. When the i7 was introduced, Intel refused to license the platform to nVidia, and 1/3 of nVidia's revenue and profit died. The CEO of nVidia opened an anti-trust complaint and threatened to sue; it got really nasty. Without replacing that revenue there will be a very significant drop in nVidia's stock price and value.

Re:No x86 or Chipset. (1)

smash (1351) | more than 3 years ago | (#34830230)

You're overlooking the hundreds of millions of iPhone/iPad-like devices coming onto the market in the next few years. The PC market is saturated; for many people their hardware is plenty "good enough". Trying to compete and sell new machines in that space is going to be a lot more difficult than in the phone/tablet market.

Re:No x86 or Chipset. (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34830402)

Unless of course they know something we don't.

Re:No x86 or Chipset. (1)

aztracker1 (702135) | more than 3 years ago | (#34830612)

I don't know why Nvidia doesn't buy VIA, or whoever has the necessary patents from Transmeta's portfolio...

Re:No x86 or Chipset. (1)

rahvin112 (446269) | more than 3 years ago | (#34830788)

The VIA license is non-transferable. In the event VIA changes ownership, the x86 license terminates. The only way VIA and nVidia could merge with VIA retaining the license would be for VIA to buy nVidia.

Re:No x86 or Chipset. (1)

h4rr4r (612664) | more than 3 years ago | (#34830870)

Easy enough: Nvidia buys X shares of Via for $Y billion. Via uses that money to buy Nvidia.

nVidia needs to die in a fire (0, Flamebait)

erroneus (253617) | more than 3 years ago | (#34829980)

I'm so thoroughly done with nVidia it is worse than the hate for an ex-wife.

They may or may not be the performance leader or whatever they think they are, but their abuse of Linux users is just too bad when compared to their only competitor, AMD/ATI. I'm about to order a new workstation and guess what? It'll have the top tier ATI graphics in it. nVidia just has no excuse for playing these 1990's proprietary crap games excluding Linux any more.

Die nVidia. Die in a fire.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830008)

Along with intel, if I may add to the rant!

Re:nVidia needs to die in a fire (1, Funny)

JonySuede (1908576) | more than 3 years ago | (#34830074)

Die nVidia. Die in a fire.

that sounded like it came from a stereotypical bearded islamofascist

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830118)

Worse, a frothing Linux user!

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830128)

I'm so thoroughly done with Toyota it is worse than the hate for an ex-wife.

They may or may not be the performance leader or whatever they think they are, but their abuse of Mazda drivers is just too bad when compared to their competitor, GM. I'm about to order a new car and guess what? It'll have the top tier Mazda. Toyota just has no excuse for playing these 1990's proprietary crap games excluding Mazda any more.

Die Toyota. Die in a fire.

Re:nVidia needs to die in a fire (1)

Anonymous Coward | more than 3 years ago | (#34830190)

I'm so thoroughly done with my ex wife it is worse than the hate for toyota or nVidia

She may or may not be the performance leader or whatever she think she is, but her abuse of me is just too bad when compared to her competitors, hot 20 year olds. I'm about to order a new one and guess what? She'll have huge knockers. She just has no excuse for playing these crap games any more.

Die ex wife. Die in a fire.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830284)

Bravo!

Re:nVidia needs to die in a fire (5, Insightful)

Locke2005 (849178) | more than 3 years ago | (#34830154)

Both ATI and nVidia suck. But it is far better to have the two of them competing with each other to at least pretend to be meeting their customers' needs than to have one of them fail, leaving us with only a single source for graphics chipsets. Would you really like to see AMD/ATI become the single video card vendor, complete with an AT&T "fuck you, we don't have to care, where else are you going to go?" attitude?

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34831982)

What's your problem with Nvidia?

Are you really crying that they have binary drivers that actually work?

Did a card crap out on you that had the same chip material failures over multiple electronics industries?

Again, if it's about the driver: write your own or deal with it; it works.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830162)

What? Historically speaking, nvidia has had very decent, if not better, support for Linux... I don't get it.

Re:nVidia needs to die in a fire (4, Insightful)

C0vardeAn0nim0 (232451) | more than 3 years ago | (#34830282)

Get this: even if Windows is better for some stuff, die-hard zealots will stick to Linux; it's about being open/free source.

ATI contributes code in the open, and even if it sucks, it's preferable (for the die-hards) to the better-working but proprietary nVidia code.

Re:nVidia needs to die in a fire (4, Insightful)

Yosho (135835) | more than 3 years ago | (#34831018)

Get this: Linux users are a minority, and die-hard zealots are a minority in that minority.

Most of the people who buy video cards do so either for high-end industrial work or gaming, and the vast majority of those people use Windows and do not care whether their drivers are open source or not.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830172)

Nice. So you're going to order the top of the line 6000 series ATI card so you can be free of Nvidia, right? Well I hope you know they only recently released open source drivers for the 6000 series and the performance is still terrible. The binary drivers that ATI has are still 10x faster than the open source ones... so enjoy NOT using your hardware to the fullest, because you would be contradicting yourself if you installed the horrible ATI binary drivers.

At least Nvidia's binary blobs work.

Re:nVidia needs to die in a fire (1)

erroneus (253617) | more than 3 years ago | (#34830458)

If only they would work on my Alienware M11xR2 with optimus hybrid graphics. They won't and they never will. They totally locked Linux users out.

Re:nVidia needs to die in a fire (4, Insightful)

Korin43 (881732) | more than 3 years ago | (#34830208)

What's wrong with nVidia? They don't provide open source drivers, but they do provide the *best* drivers for Linux. While I'd rather have good and open source drivers, good is a higher priority to me. I guess ATI has been getting better, but I've never had bad experiences with nVidia drivers.

And it's worth noting that they don't provide open source Windows drivers either and likely never will. Complaining because they don't do more for Linux users than they do for Windows users seems strange to me.

Re:nVidia needs to die in a fire (1)

ifiwereasculptor (1870574) | more than 3 years ago | (#34830210)

Maybe it's because I have an outdated card, but I don't get the hate. I thought Nvidia released good drivers for Linux and all. I'm a Linux user, I have a GeForce and performance seems comparable to that of Windows. Proprietary drivers, yes, but good ones. Am I missing something?

Re:nVidia needs to die in a fire (1)

sexconker (1179573) | more than 3 years ago | (#34830454)

Maybe it's because I have an outdated card, but I don't get the hate. I thought Nvidia released good drivers for Linux and all. I'm a Linux user, I have a GeForce and performance seems comparable to that of Windows. Proprietary drivers, yes, but good ones. Am I missing something?

Open source drivers from both camps suck ass.
Closed source drivers from AMD suck ass.
Closed source drivers from NVidia are competent.

Linux Neckbeard Warriors will never publicly support installing the closed source drivers, but every single one of them will do it.

Re:nVidia needs to die in a fire (1)

HJED (1304957) | more than 3 years ago | (#34830484)

Nvidia doesn't release drivers under the GPL while ATI/AMD does, and this upsets some of the die-hard FOSS fans.
I haven't used the ATI drivers on Linux so I can't comment on them, but I have found that the Nvidia drivers provide the same performance as on Windows.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34832594)

I have used open-source AMD/ATI drivers on Linux. They are abysmal. They do just enough 3D to run desktop effects and glxgears, but any more advanced app is likely to crash, sometimes causing persistent display corruption requiring an X restart.

Note: by "advanced" here I mean using features from OpenGL 2.0 or later (about 7 years old, the current standard is 4.1).

I expect that the vast array of missing features and bugs will be reduced over time, and look forward to having usable drivers in a couple of years.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830414)

"Die nVidia. Die in a fire."

but what about the 'invid flower of life' if they do die in a fire i have a feeling that the flower is just like in super mario bros 1. fireballs all that is left.

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830470)

I'm so thoroughly done with nVidia it is worse than the hate for an ex-wife.

They may or may not be the performance leader or whatever they think they are, but their abuse of Linux users is just too bad when compared to their only competitor, AMD/ATI. I'm about to order a new workstation and guess what? It'll have the top tier ATI graphics in it. nVidia just has no excuse for playing these 1990's proprietary crap games excluding Linux any more.

Die nVidia. Die in a fire.

"Allah akbar" much??

Re:nVidia needs to die in a fire (1)

erroneus (253617) | more than 3 years ago | (#34831026)

Shut up! I keel you!!

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830502)

"Waaah"

If Linux needs improvement, it's not in shitty open source, but in user experience. The drivers tend to "just work" and give the best performance of all three mainline vendors for OpenGL/CUDA/OpenCL. So why not go spend your time doing something useful, like improving existing open source, instead of just whining that you need more?

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34830626)

Agreed.

Screw you, Nvidia. You made an entire year's production run of bad GPU chips, sold them to various vendors, then took a huge charge against your figures in Q2 2008 to pay off the vendors for any 'repair' issues arising from what was manufactured defective. These contracts between yourself and the OEMs were secret, and you never disclosed publicly which production lines were considered faulty, instead referring all customer inquiries to their OEM. Now there's a bigass class-action lawsuit just now working its way through the courts and we'll all get '$20-off your next nvidia-gpu-equipped-laptop' certificates.

Also screw Hewlett-Packard, for knowing full well the scale of laptops this issue affected and yet charging your customers for out-of-standard-1-year-warranty repairs on graphics cards that had an average defective life of, you guessed it, one year. Also screw you for releasing a BIOS update which only turned the fan on full speed and thus delayed the death of the GPU past the standard-warranty mark. This pissed off thousands of folks who expected your company to meet a standard of quality that it no longer can. You took Nvidia's money, then you took ours (check out http://www.hplies.com for more about this folly if you care to).

So earlier this year I got a new laptop: an ASUS G73 with an ATI 5870 inside. Damn, I made the right decision this time around! Life is too short to get fuc

Re:nVidia needs to die in a fire (0)

Anonymous Coward | more than 3 years ago | (#34831864)

I have been using nVidia for something like 12 years now, almost solely on Linux too. Laptops, desktops, workstations, no big problems to speak of. I have tried a few of the alternatives over the years but always end up sticking with nVidia, they work the best with Linux, period.

Let's hope Intel and NVIDIA can end their fighting (0)

jonwil (467024) | more than 3 years ago | (#34830476)

Let's hope Intel and NVIDIA can end their fighting so that NVIDIA can make chipsets for the latest Intel CPUs again.

Re:Let's hope Intel and NVIDIA can end their fighti (1)

Macman408 (1308925) | more than 3 years ago | (#34832618)

Don't get your hopes up. Part of the agreement specifically amends the old chipset license to say that NVIDIA can't make chipsets for Sandy Bridge, Westmere, Nehalem, etc. chips that have a memory controller built-in. NVIDIA can make discrete graphics for these, of course, but the MCP line is D-E-D dead.

In related news... AMD CEO resigns! (3, Insightful)

IYagami (136831) | more than 3 years ago | (#34831086)

See http://www.amd.com/us/press-releases/Pages/amd-appts-seifert-2011jan10.aspx [amd.com]

Some very interesting analysis can be found at:
http://www.brightsideofnews.com/news/2011/1/10/coup-at-amd-dirk-meyer-pushed-out.aspx [brightsideofnews.com]
"Remember, Dirk Meyer’s three deadly sins were:

1) Failure to execute: K8/Hammer/AMD64 was 18 months late, Barcelona was deliberately delayed by 9 months, and the original Bulldozer was scrapped and is running 22 months late. -I personally think this is not true; Dirk Meyer was AMD's CEO from July 18, 2008 until January 10, 2011, so he could not be responsible for K8 or Barcelona. Bulldozer, however...-
2) Giving the netbook market to Intel [AMD created the first netbook as part of the OLPC project] and the long delays of the Barcelona and Bulldozer architectures. -This is interesting; after Intel's serious failure with the Pentium 4, its mobile division was the one that changed everything with the Intel Core 2, designed from a mobile perspective.-
3) Completely missing perspective on the handheld space - selling Imageon to Qualcomm and Xilleon to Broadcom. -I think this is the key; no one expected this market to be as successful as it is at the moment.-"

bad news for us (1)

ILuvRamen (1026668) | more than 3 years ago | (#34832022)

This will help AMD because, to cover the costs, Intel has to raise its prices slightly. That means AMD can compete more in the cost-vs-performance battle, so hurray for AMD - except you have to realize that the customers get screwed. The only time AMD should do better is when they make better processors; THAT benefits us. When they do better without as much motivation to advance their processors' performance, things go downhill for the customers, because they get a slower chip in the long run.

Re:bad news for us (1)

ustolemyname (1301665) | more than 3 years ago | (#34832220)

Well, the deal is $1.5 billion over five years. $300 million/year. Given that they bring in over 100 times that in revenue each year, I don't think that is going to be a big deal for them.
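The arithmetic in the parent checks out, give or take. A quick back-of-the-envelope sketch (the $43.6 billion figure for Intel's 2010 revenue is an outside assumption, not something stated in this thread):

```python
# Back-of-the-envelope: how big is the NVIDIA settlement for Intel?
total_settlement = 1.5e9     # $1.5 billion, paid in five annual installments
years = 5
annual_payment = total_settlement / years

intel_2010_revenue = 43.6e9  # assumed figure for Intel's 2010 annual revenue

print(f"Annual payment: ${annual_payment / 1e6:.0f} million")
print(f"Share of annual revenue: {annual_payment / intel_2010_revenue:.2%}")
```

On those assumptions the payment comes to $300 million a year, well under one percent of revenue - consistent with the parent's "over 100 times that" claim.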