
NVIDIA To License Its GPU Tech

Soulskill posted about a year ago | from the sea-change dept.


An anonymous reader writes "Today in a blog post, NVIDIA's General Counsel, David Shannon, announced that the company will begin licensing its GPU cores and patent portfolio to device makers. '[I]t's not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.' He cites the 'explosion of Android devices' as one of the prime reasons for this decision. 'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.' Shannon points out that NVIDIA did something similar with the GPU core used in the PlayStation 3, which was licensed to Sony. But mobile seems to be the big opportunity now: 'We'll start by licensing the GPU core based on the NVIDIA Kepler architecture, the world's most advanced, most efficient GPU. Its DX11, OpenGL 4.3, and GPGPU capabilities, along with vastly superior performance and efficiency, create a new class of licensable GPU cores. Through our efforts designing Tegra into mobile devices, we've gained valuable experience designing for the smallest power envelopes. As a result, Kepler can operate in a half-watt power envelope, making it scalable from smartphones to supercomputers.'"


111 comments


Translation: (4, Interesting)

SeaFox (739806) | about a year ago | (#44045733)

We want to transition to an IP company.
Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

Re:Translation: (5, Insightful)

Trepidity (597) | about a year ago | (#44045751)

Notice who gave the announcement?

NVIDIA's General Counsel, David Shannon, announced that...

Worked well for apple ... right? (4, Funny)

DavidClarkeHR (2769805) | about a year ago | (#44045881)

We want to transition to an IP company. Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

Nah, that's not why. They're following in the footsteps of Apple and Sega - license out your key strengths to strategic partners, and you're sure to succeed.

Right?!?

Re:Worked well for apple ... right? (2)

ozmanjusri (601766) | about a year ago | (#44047013)

Nah, that's not why. They're following in the footsteps of ARM - license out your key strengths to strategic partners, and you're sure to succeed.

Right?!?

FTFY.

[I]t's not practical to build silicon or systems to address every part of the expanding market.

citing the 'explosion of Android devices' as one of the prime reasons for this decision.

'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.'

So breaking a long-held monopoly and opening a market to competition has led to vastly increased opportunity and innovation. Gee, who'd have thought it?

Re:Translation: (5, Insightful)

amirulbahr (1216502) | about a year ago | (#44045983)

Yeah because designing a GPU is not really making stuff. A bit like how writing software is done by lawyers and executives.

This sounds like good news and an obvious step to me. It should lead to smaller and more energy efficient computing devices in the future.

Re:Translation: (2)

fuzzyfuzzyfungus (1223518) | about a year ago | (#44046097)

Yeah because designing a GPU is not really making stuff. A bit like how writing software is done by lawyers and executives.

This sounds like good news and an obvious step to me. It should lead to smaller and more energy efficient computing devices in the future.

I suspect that they also don't have too much of a choice: the cost and energy savings of die-level integration with the CPU are difficult to ignore (and, even if they were less impressive, AMD and Intel both have pet GPUs that they integrate into most of their cores, and can freeze out anything more tightly integrated than a PCIe device at their whim, as Intel indeed did when they changed Northbridge interfaces). Either Nvidia commits to building SoCs that are all things to all people (a rather tall order), or they allow existing SoC-spinners to choose a GPU architecture with rather punchier PC roots than some of the traditional low-power/embedded guys.

Re:Translation: (3, Insightful)

rahvin112 (446269) | about a year ago | (#44046211)

Either Nvidia commits to building SoCs that are all things to all people

That is what Project Denver was supposed to be. This announcement probably confirms that Project Denver is a failure that will never see the light of day. Denver was supposed to be the company's salvation after HPC, Tegra and everything else failed to meet the projections they set with Wall Street.

Re:Translation: (1)

hairyfeet (841228) | about a year ago | (#44046223)

Which is why I'm still scratching my head as to why, when they saw how the wind was blowing, they didn't just buy out Via. Anybody see the benches for the Nano duals? They weren't bad at all, and they had several advantages that were just made for certain markets, like having baked-in hardware crypto support. So WTF, Nvidia?

It's not like you couldn't afford to buy a little pipsqueak like Via; you could then use your ARM cores to build some fucking awesome hybrids. Imagine laptops and tablets that could run full Windows and Linux when plugged into the wall but when on battery you would switch into "uber battery" mode and get ARM battery life with none of the downsides...who wouldn't want to buy that? I want one of those: a tablet that gets all-day battery, but when I have some Windows program I need to run, just plug into the mains and there ya go. That would be fricking sweet and really innovative.

So I just don't get it; it seems like a perfect fit. Like AMD with ATI, you have x86 cores that are "good enough" with killer graphics. If they had done this when AMD bought ATI, they could have had at least one of the new consoles running something like an Nvidia Nano quad instead of being shut out.

Oh well, I'll just have to stick with AMD and hope the Apple chip dev comes out with something to replace bullsucker, as the AM3+ chips ain't gonna last forever. It's a damned shame though, as I bet an Nvidia Tegra Nano would have been sweet.

Re:Translation: (2)

noh8rz10 (2716597) | about a year ago | (#44046401)

Imagine laptops and tablets that could run full Windows and Linux when plugged into the wall but when on battery you would switch into "uber battery" mode and get ARM battery life with none of the downsides...who wouldn't want to buy that?

Sounds like a crappy experience - a Windows "laptop" that sucks when not plugged in? "No thanx," said the world!

Re:Translation: (0)

hairyfeet (841228) | about a year ago | (#44046575)

WTF are you babbling about? The biggest complaint about Windows RT is the fact it can't run Windows programs, and this would solve that problem. Or if you don't like Windows? Fine, imagine an Ubuntu tablet that could run Windows programs via WINE when plugged in and seamlessly switch to Ubuntu for ARM when you pulled it from the keyboard cradle, thus giving you ARM battery life...who wouldn't want that? It's a hell of a lot better than having to use that shitty cellphone "app" crap!

Re:Translation: (1)

noh8rz10 (2716597) | about a year ago | (#44046659)

Look, if I need a Windows laptop then I'm going to get a Windows laptop. If it's a POS normally and only becomes a Windows laptop when plugged in, I might as well just get two devices, or a desktop computer for that matter!

Re:Translation: (0)

Anonymous Coward | about a year ago | (#44047459)

The biggest complaint about Windows RT is that the devices cost more than an Android tablet or an x86 Windows 8 laptop and can't do as much as either.

Re:Translation: (1)

symbolset (646467) | about a year ago | (#44046509)

Via's greatest value is to somebody else. So if nVidia tried to buy them they would just be outbid.

Intel can't touch Via right now (0)

Anonymous Coward | about a year ago | (#44047351)

The long arm of Intel's legal team can't reach VIA right now, but it would quickly devolve into a legal battle (which Intel would win) if VIA's ownership moved to NV, which is not only in the same country as Intel but in fact headquartered in the same city (Santa Clara).

Re:Intel can't touch Via right now (1)

unixisc (2429386) | about a year ago | (#44051309)

Did Via ever make anything based on the AMD64 instruction set? What was the last legal status between Intel & Cyrix, or Intel & Centaur - both companies (Cyrix & Centaur) that Via bought? If Via had licensed the AMD64 instruction set from AMD, Intel would be out of luck there. In fact, I doubt that Via has to make CPUs that support both x86 and x64 - they can make them separately for each case.

Re:Intel can't touch Via right now (1)

hairyfeet (841228) | about a year ago | (#44051831)

Well according to the wiki the Via Nano is 64-bit [wikipedia.org], and since all x86 64-bit support is based on AMD64, it stands to reason that they have permission from AMD to use 64-bit instructions. I also don't see how Intel could say shit, since the way Centaur came up with their chips was NOT by copying the Intel design, as was originally the case with AMD, who was a second source for early x86 chips, but by reverse engineering the Intel chips. In fact I remember this was an issue with the early C3 chips, in that their reverse engineering didn't perfectly copy the Intel way of doing things, so there were some corner cases where software that would work on an Intel or AMD wouldn't run on a Centaur.

But frankly Nvidia would only need the Nano; there's no reason to even bother with 32-bit in this day and age except for a little backwards compatibility, which with Android and ChromeOS is less of a problem. If you look at the specs I linked to or check out some benchmarks, it would be a perfect fit for Nvidia: it has performance like an Athlon but uses power more like a Bobcat, it's 64-bit, they have everything from singles to quads, and the baked-in crypto would be a selling point in the server space.

I'm telling ya, if you added a Tegra on chip so it had the nicer GPU (they use the older Via Chrome GPU) along with the ARM cores, so they could switch between ARM and x86 or even use both if required? You'd have a pretty sweet chip. You could even add hardware-assisted VirtualBox or WINE so that folks could have those one or two "must have" Windows programs while still having ARM mobility. As I said, it would be truly innovative and I can see multiple places where that would be an easy sell.

Re:Translation: (1)

SuricouRaven (1897204) | about a year ago | (#44049021)

" like having baked in hardware crypto support"

I was very disappointed indeed to learn the Atom in my router does not have this.

Re:Translation: (1)

hairyfeet (841228) | about a year ago | (#44051639)

Dude, if you want hardware crypto pick up one of the Via micro boards; they are less than $200 (sometimes less than $100 if you catch a sale) and have AES support baked in. I have used a couple for low-power servers and they are awesome if you have to have encryption: crazy low power usage, and thanks to the baked-in crypto support I'd say it performs a little better than an Athlon of the same speed while using power more like a Bobcat. Oh, and thanks to the fact they are often used in carputers, you can get them in heatsink cases that are great for a router or a NAS using external drive(s).

So check 'em out. I'm all for using the best tool for the job, and for those that need decent crypto support while not cranking up the power and heat? They fit the bill VERY nicely.

Re:Translation: (1)

default luser (529332) | about a year ago | (#44050451)

It's not like you couldn't afford to buy a little pipsqueak like Via

Buy? VIA and Nvidia have similar revenue (4-5B/yr). The only thing that could happen would be a merger, and that would never fly because Jen-Hsun Huang would insist on running the combined company.

And there's no guarantee that the VIA x86 license would still be honored after a buyout. I believe that's one of the reasons nobody has made a move to purchase them.

Imagine laptops and tablets that could run full Windows and Linux when plugged into the wall but when on battery you would switch into "uber battery" mode and get ARM battery life with none of the downsides...who wouldn't want to buy that?

I think you overestimate the demand for such a complex machine, and also forget that only since last year have you been capable of running Windows on ARM. The real solution has been to add all the ARM power optimizations to the Intel Atom, and now Haswell, so that you can get all-day battery life without compromise (and with no added hybrid system cost).

As for the value of the VIA Nano, it's reasonably impressive, but not suited for phones or tablets. A single core still uses at least 4 W, but perhaps better process tech could help. A merger with Nvidia would have fixed ONE SERIOUS PROBLEM for VIA: their shitty chipsets, terrible IGP, and crappy drivers for both.

Re:Translation: (1)

unixisc (2429386) | about a year ago | (#44051257)

So it would be the converse of AMD/ATI - it would be nVidia buying Cyrix/Centaur??

Re:Translation: (1)

drinkypoo (153816) | about a year ago | (#44048731)

The question I have is why this is actually necessary. Is the market actually demanding to pair nVidia GPUs with crappy CPU cores? Because nVidia is already pairing them with good ones and offering SoCs, e.g. Tegra. Tegra has a metric assload of CPU; it's hard to imagine that they couldn't offer a dual-core and a quad-core version and cover the vast majority of cases.

Re:Translation: (2)

fuzzyfuzzyfungus (1223518) | about a year ago | (#44046081)

We want to transition to an IP company.
Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

Nvidia has been fabless since the beginning; the only difference with this announcement is that they'll sell you the ability to put their GPU on your die, rather than exclusively buying and reselling TSMC-fabbed GPUs of their design...

Re:Translation: (3, Insightful)

hairyfeet (841228) | about a year ago | (#44046105)

Actual translation "Intel fucked us in the ass more than AMD that at least got a billion plus for their ass reaming, all we got was the curb. Now we are just gonna have to become patent trolls because with AMD owning ATI and Intel going their own way we missed the boat...damn we should have bought Via".

What I want to know is...what in the hell does Intel have on the DoJ to keep getting away with this shit? You had all the major OEMs saying they were taking kickbacks all through the P4 period, you have them shutting out Nvidia in the chipset market, which if that isn't classic antitrust I don't know WTF is. Frankly what Intel has been doing has been worse than what got MSFT's balls in a sling (and I still think they should have split up MSFT), yet they seem to always walk away scot-free. WTF?

Oh and for Nvidia fans...sorry but I could have told ya so. AMD saw how much these super insane-o monster chips were costing to make, so they did the VERY smart move of developing the midrange chips and then simply adding a second chip for the high range. Nvidia kept the old way of building the uber-badass chip and then figuring out how to cut it down, but doing it that way equals crazy hot chips, and it takes them a while to figure out how to selectively cripple a chip without totally trashing it. Since AMD aims for the much more lucrative midrange market, it's easier for them to cover the spectrum of prices without breaking the bank.

Re:Translation: (4, Insightful)

Cassini2 (956052) | about a year ago | (#44046183)

Intel periodically cuts patent cross-licensing deals with AMD that have the side-effect of bailing AMD out financially. This keeps AMD around as a competitor.

If Intel adopted Apple's "thermonuclear war" attitude, AMD would have been out of business from the legal fees and injunctions long ago. However, if AMD was out of business, then Intel would be a monopoly and that would be bad for Intel.

Intel manages AMD, as best it can, such that AMD gets 20% market share, and no x86 profits to speak of. With "only" 80% market share, Intel gets to keep all of the profitable market segments, with no FTC and DOJ oversight. AMD is left appealing to those who want cheap CPUs.

Re:Translation: (4, Interesting)

hairyfeet (841228) | about a year ago | (#44046557)

Which is why I sell AMD exclusively, because I hate insider douchebaggery and Intel is king of the douchebags. And ya know what? I have yet to hear a single complaint that their system is too slow, not one. And I put my money where my mouth is: I have an AMD hexacore that just chews through games and transcoding (and does so at the same time if I want) and cost me, at $105 shipped, less than a Pentium dual-core. So unless a person is in one of those rare fields where they need every possible cycle they can squeeze out of a machine they really are just pissing money away. And as a bonus the money I saved let me get a nicer gaming board, twice the memory I would have gotten otherwise, and plenty of upgrade options down the road if I want to go even faster or get the octo-core.

But that still don't explain why in the fuck Intel don't get busted, after all Apple and Linux were around when the DoJ busted MSFT's ass, and if anything Intel has a tighter lock on the market than MSFT ever did. So I want to know who is cashing the checks, who is getting paid off, as I smell some dirty dealing, which as we saw with the kickback scandal is SOP for Intel.

Re:Translation: (2)

Ash-Fox (726320) | about a year ago | (#44047873)

I have yet to hear a single complaint that their system is too slow, not one.

The AMD V105 in my Acer Aspire One is too slow. I've used older generation Atom CPUs that were faster.

So unless a person is in one of those rare fields where they need every possible cycle they can squeeze out of a machine they really are just pissing money away.

I usually don't buy AMD to avoid the repeated erratum issues [gmane.org].

But that still don't explain why in the fuck Intel don't get busted,

They did.

Re:Translation: (1, Offtopic)

hairyfeet (841228) | about a year ago | (#44048025)

I'm sorry, but according to Notebookcheck [notebookcheck.net] the chip you have is faster than the Atom it was originally made to compete against, and has a faster CPU to boot. Before you put too much stock in what a particular benchmark says, you might want to look up "Intel cripple compiler" to see how badly rigged anything compiled with the Intel compiler comes out. In one test they took a Via CPU and, simply by changing the CPUID from "Centaur Hauls" to "Genuine Intel", scored a full 30% HIGHER on the benchmark! Wow, amazing huh?
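For context on the "cripple compiler" point: the dispatchers that some compilers generate pick a code path by reading the CPUID vendor string rather than the advertised feature flags. Here is a minimal sketch of reading that string (C, using the GCC/Clang cpuid.h helper; note the actual vendor IDs are the spaceless strings "GenuineIntel" and "CentaurHauls"):

    #include <stdio.h>
    #include <string.h>
    #include <cpuid.h>          /* GCC/Clang wrapper around the CPUID instruction */

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        /* Leaf 0: the 12-byte vendor ID comes back in EBX, EDX, ECX. */
        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 1;
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);

        /* A dispatcher that branches on this string instead of on the
           SSE2/SSE3 feature bits will send "CentaurHauls" or
           "AuthenticAMD" down the slow generic path even when the
           faster code would run fine. */
        printf("CPU vendor: %s\n", vendor);
        return 0;
    }

Which is why changing nothing but the reported vendor ID can move benchmark scores, as the test described above found.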

As for your specific situation, I can tell you what the problem is AND how to fix it. The problem is...you bought an Acer. Sorry, but Acer is pretty notorious for skimping in every way they can; if you look up "replace thermal paste" and that model you will find many YouTube videos showing you how to remove the shitty thermal pad they used and replace it with something that works. I would suggest Arctic Silver, first pre-treat both the CPU and the heatsink to get the paste into the tiny imperfections, followed by putting a grain-of-rice-sized drop on the CPU and reseating the heatsink. I have had to do that trick a few times on Acers that end up in my shop because of "being slow" and you'd be amazed how much having an actually functional heatsink helps.

As for Intel getting busted...citation please? The only thing I heard of was a payoff to AMD to get them out of court; that is NOT "getting busted", that is bribery. And you name the Intel CPU and I can produce a nice long errata sheet for it as well; I don't think there has been a CPU since the 286 that didn't end up with a long errata list. I also frankly don't put much stock in an "errata" that is only being seen by some guy running an EXTREMELY niche OS and then ONLY when he compiles with a certain version of GCC. AMD offers their own compiler, based on GCC as a matter of fact, so why this dude doesn't use that? Who the fuck knows.

Meanwhile I've sold hundreds of AMD units, both desktop and laptop, with no complaints about speed. I was impressed enough that my entire family is on AMD: five desktops and a laptop. I too have a netbook, but unlike you I went with the better-built Asus Eee and am VERY pleased by the performance; in fact after 3 years I still get over 4 hours on a battery watching movies, 5 websurfing, and I can slap an HDMI cable between my netbook and any TV and use it as an HTPC, which I have done on several occasions. Works great.

Re:Translation: (1)

Khyber (864651) | about a year ago | (#44051355)

" I would suggest Arctic Silver, first pre-treat both the CPU and the heatsink to get the paste into the tiny imperfections"

That isn't going to do you any good, considering the ball size of the thermal compound components is larger than most imperfections on the surface of the heat sink and processor packaging (40 or so nm, compared to the 15 or so nm of a heat sink). This is why we're looking into carbon nanotube transfer pads, to fit into those very small imperfections and make for much better heat transfer and fewer air pockets to act as insulation.

"I also frankly don't put much stock in an "errata" that is only being seen by some guy running an EXTREMELY niche OS and then ONLY when he compiles with a certain version of GCC"

As I stated above, sounds like the n00b programmer just learned how to work with a crippled/biased year+ old compiler.

Re:Translation: (1)

hairyfeet (841228) | about a year ago | (#44051569)

While I would normally agree with you...dude, we are talking about an Acer here; have you ever seen how little polishing their heatsinks get? Trust me, with the amount of imperfections in one of their heatsinks you can easily get a nice thin layer of silver in the cracks. Plus it's easy to tell when you have a heatsink that pretreating will help: it only takes a drop and an old CC, and when you wipe it off, if the heatsink has changed to a dull grey? It's got silver in the cracks.

And my little stalker troll can mod it down all he wants; if someone asks a legitimate question or posts a legitimate counterargument to which I have an answer? I'm gonna respond. He said the AMD V105 was slower than an Atom; I posted hard data showing that not only is it not slower, the 4225 GPU it has is significantly better than what comes with any Atom netbook. I also pointed out that the Acer Aspire is pretty well known for having heat issues, as well as how to solve them.

Finally, as for the "errata" he linked to? One noob programmer with a variant of BSD has an issue with one version of GCC...which frankly he shouldn't be using in the first place, as AMD has their own 100% free (in every sense) version of GCC that will optimize for their chips, so using stock GCC is only slowing down his performance. Yeah, it's a noob and I really wouldn't put too much stock in his not being able to compile correctly; he is probably "doin it wrong".

Re:Translation: (1)

drinkypoo (153816) | about a year ago | (#44048763)

I usually don't buy AMD to avoid the repeated erratum issues.

Intel errata lists are not only plenty long, but the probability of them increasing in size with bugs they didn't find and/or didn't want to admit initially is... well, let's just say I produced an FDIV error while trying to calculate it.

Re:Translation: (1)

Khyber (864651) | about a year ago | (#44051249)

* The bug still occurs if I place the MFENCE+NOP at the beginning of the function.

* Placing the MFENCE+NOP at the end of the function causes the bug to stop occurring. The bug disappears completely in that case with testing over 2 days.

* Placing just a NOP at the end of the function causes the bug to stop occurring. As of this writing this is still undergoing longer tests.

Sounds to me like someone who just learned how to program and is using a year-plus-old, half-crippled compiler.
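For anyone unfamiliar with what the quoted report means by the workaround, here is a minimal sketch (assuming GCC-style extended asm on x86-64; the function name is made up for illustration) of placing an MFENCE+NOP at the end of a function:

    #include <stdio.h>

    /* Hypothetical illustration of the workaround described in the
       quoted report: emit an MFENCE followed by a NOP as the last
       thing the affected function does. */
    void affected_function(void)
    {
        /* ... the function's real work would go here ... */

        /* Serialize prior memory operations, then pad with a NOP; the
           "memory" clobber keeps the compiler from reordering other
           accesses across it. */
        __asm__ __volatile__("mfence\n\tnop" ::: "memory");
    }

    int main(void)
    {
        affected_function();
        puts("done");
        return 0;
    }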

Wrong (1)

edxwelch (600979) | about a year ago | (#44049375)

This is a popular myth, that Intel must keep AMD around or otherwise it will be broken up by the government, or something.
In fact, there is nothing illegal about having a monopoly in itself. What is illegal is certain business practices carried out by monopolies.
Intel does not help AMD in any way. In fact, it wants 100% of the x86 market. You can see this by the way it is now going after the lower end of the market with Silvermont-based Celerons.

Re:Translation: (1)

Khyber (864651) | about a year ago | (#44051187)

"Intel periodically cuts patent cross-licensing deals with AMD that have the side-effect of bailing AMD out financially."

Good thing AMD won the entire console war this upcoming generation, because that means AMD isn't going to have much of a financial problem this next half a decade, at least.

"Intel manages AMD, as best it can, such that AMD gets 20% market share, and no x86 profits to speak of."

You might want to rethink that, considering the majority of Intel chips using x86-64 are using AMD's 64-bit instructions and registers. AMD also has a much larger overall market share right now than the 20% you state, once you include consoles, which are just restricted, over-glorified computers.

"AMD is left appealing to those who want cheap CPUs."

Actually, they appeal to me because in an architecture-neutral compiled program AMD beats the shit out of Intel in multi-threaded workloads that don't require special instructions, at cheaper cost.

Re:Translation: (-1)

Anonymous Coward | about a year ago | (#44046253)

You know who else like to fuck people in the ass? Faggots, that's who.

Re:Translation: (1)

Joe_Dragon (2206452) | about a year ago | (#44046677)

Back then Apple could have had a real nice Mac Pro with dual AMD CPUs and an nForce Pro chipset, so the Mac Pro 1 would not have needed those PCIe switchers all over the place with fewer PCIe lanes than the older G5 had.

Re:Translation: (0)

Anonymous Coward | about a year ago | (#44046843)

Yeah, Intel is perhaps the most evil company in computing ever. They have more power over the computing landscape than IBM ever had (in the 70s), yet they manage to avoid antitrust action while IBM had lots of problems with the justice system 35-40 years ago. The list of anticompetitive Intel tactics would be longer than a Dostoevsky novel, so I shall stop here.

Why IBM chose Intel in 1981 is anyone's guess (some claim that this is because they thought that such a pile of junk crapitecture had no chance of ever encroaching on their high-end, money-making core business; history has proven them wrong).

The 8088/8086 was by far the crappiest of the first 16-bit processors of the time (segmented addressing, anyone?). But Intel has learnt from IBM's history and always managed to weasel out of any antitrust investigations.

Bring the x86 monoculture down. Actually Intel has tried at least twice and failed, the irony being that the 64-bit version was defined by AMD and Intel had to follow suit, because Microsoft told them that x86-64 was there and they were not going to support a different 64-bit x86 extension from Intel (which Intel had planned). For Intel, that was the first time they had to follow AMD's (or anyone else's) lead, and it was probably a tremendous shock, NIH syndrome and so on.

Re:Translation: (3, Interesting)

Kjella (173770) | about a year ago | (#44047055)

Actual translation "Intel fucked us in the ass more than AMD that at least got a billion plus for their ass reaming, all we got was the curb. Now we are just gonna have to become patent trolls because with AMD owning ATI and Intel going their own way we missed the boat...damn we should have bought Via". (...) Oh and for Nvidia fans...sorry but I could have told ya so. AMD [has been so much smarter]

Yes, because AMD has totally been flowers and sunshine ever since. In their Q1 2013 finances, stockholders' equity was down to $415 million; one more total-disaster quarter like Q4 2012, with its $473 million loss, and they're filing for bankruptcy. Meanwhile nVidia's market cap is more than twice as big as AMD's (and that is after AMD's stock recovered; it was 5x for a little while there) and they're making money, so this is not a back-against-the-wall move. It's the realization that building a complete SoC is complicated and just having good graphics is not enough; better to play the PowerVR game (and PowerVR are not productless IP trolls) and be in other people's SoCs than to be nowhere at all.

1 BILLION Dollars - thanks Intel! (1)

Anonymous Coward | about a year ago | (#44047375)

Actually, Intel paid Nvidia over $1 billion [latimes.com] in a settlement two years ago. Also note that Nvidia has announced plans for building a new and impressive campus. I am going to guess that it cost substantially less than a billion dollars.

Part of Nvidia's agreement with Intel was to cease development of x86-compatible devices, which explains the shift for Project Denver from x86 to ARM. And with ARM came partnerships with Google/Android and that ecosystem, which has outlasted any Tegra deals Nvidia has attempted with Microsoft (the Microsoft Kin and Surface RT being two of the biggest flops with Tegra, or really in general).

Re:Translation: (1)

Jah-Wren Ryel (80510) | about a year ago | (#44047599)

What I want to know is...what in the hell does intel have on the DoJ to keep getting away with this shit?

They build the CPUs that PRISM runs on, and the CPUs backdoor them a copy of everything hoovered up about DoJ employees.

Lets parse it a bit further (2)

Camael (1048726) | about a year ago | (#44046111)

David Shannon says:

PC sales are declining with the rise of smartphones and tablets.

Uh oh, our traditional PC market is dying.

High-definition screens are proliferating, showing up on most every machine. Android is increasingly pervasive. Yesterday’s PC industry, which produced several hundred million units a year, will soon become a computing-devices industry that produces many billions of units a year. And visual computing is at the epicenter of it all.

But wait! The mobile market is hot hot hot!

For chip-makers like NVIDIA that invent fundamental advances, this disruption provides an opening to expand our business model.

We should go all in on mobile and get some of that delicious moolah.

But it’s not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.

How can we like, totally dominate this market?

So, our next step is to license our GPU cores and visual computing patent portfolio to device manufacturers to serve the needs of a large piece of the market.

Let's license out our IP! You saw how it like, totally worked for ARM, right?

The reality is that we’ve done this in the past. We licensed an earlier GPU core to Sony for the Playstation 3. And we receive more than $250 million a year from Intel as a license fee for our visual computing patents.

We tried it in baby steps, and the money was delicious.

Now, the explosion of Android devices presents an unprecedented opportunity to accelerate this effort.

More money is good.

Re:Lets parse it a bit further (-1)

Anonymous Coward | about a year ago | (#44046743)

Man your reading comprehension really is shithouse.

Re:Lets parse it a bit further (1)

Camael (1048726) | about a year ago | (#44047127)

Read between the lines, AC. Or are you one of those who take at face value everything the PR departments spoon feed to you?

Re:Lets parse it a bit further (0)

Anonymous Coward | about a year ago | (#44047795)

No, I think he was commenting on the fact that you read English like some sort of retarded ADHD USian.

Re: Lets parse it a bit further (0)

Anonymous Coward | about a year ago | (#44047947)

I've read both the lines and what's between them.

In short, they tell me you've got the mind of a 1980s used-car salesman fueled by coke and broken dreams.

Re:Translation: (1)

exomondo (1725132) | about a year ago | (#44046549)

Not sure what you're talking about; this sounds a lot more like Xerox (well, PARC anyway) or ARM, both of which were damn good.

Re:Translation: (1)

fufufang (2603203) | about a year ago | (#44046953)

We want to transition to an IP company.
Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

Oi, ARM is an IP company too! Nobody seems to have any problem with it!

Re:Translation: (1)

TheRaven64 (641858) | about a year ago | (#44047523)

There are lots of IP companies that no one has a problem with. There are basically two business models for IP companies:
  • File or buy a load of patents and then, the next time someone independently invents something you've patented, ask for royalties and sue them if you don't get them.
  • Design things of value and sell the rights to use the designs to companies that would end up paying more if they developed something in house.

There are a load of companies in the second category that are very profitable and usually respected. It's the ones in the first category that give them all a bad name.

Re:Translation: (0)

Anonymous Coward | about a year ago | (#44047335)

Aren't all fabless silicon companies just a step away from being an IP company? How is NV significantly different from ARM Ltd or Imagination Tech?

Wow (-1, Flamebait)

binarylarry (1338699) | about a year ago | (#44045737)

Wouldn't it be great if they open sourced their fucking Linux drivers instead of playing the little game they play?

Think of how many licensees they could gain.

Re:Wow (1)

inflamed (1156277) | about a year ago | (#44045815)

I may be mistaken, but I anticipate that licensing the hardware will require sharing the associated driver sources with licensees. That seems like a step in the right direction, albeit for more profitable reasons.

Re:Wow (0)

Anonymous Coward | about a year ago | (#44046025)

Think of how many licensees they could gain.

None?

Re:Wow (5, Interesting)

symbolset (646467) | about a year ago | (#44046525)

nVidia's graphics drivers include proprietary and patented Microsoft technologies. They cannot open source them, ever. They made their deal with the devil and they have to live with it.

Re:Wow (2)

exomondo (1725132) | about a year ago | (#44046587)

Wouldn't it be great if they open sourced their fucking Linux drivers instead of playing the little game they play?

Why? What would be so good about that? AMD did it and it didn't do much for them.

Re:Wow (0)

Anonymous Coward | about a year ago | (#44049145)

Wouldn't it be great if they open sourced their fucking Linux drivers instead of playing the little game they play?

Why? What would be so good about that? AMD did it and it didn't do much for them.

Obviously, the Linux drivers are the ONLY reason why AMD is behind NVidia.

Gentoo is the best (0)

Anonymous Coward | about a year ago | (#44047397)

What the fuck will you do with open source graphics drivers? Compile them in Gentoo with some custom GCC options?
Bunch of goddamn kids think that Debian and Stallman have it all figured out.

Plz license to Intel kthx (-1)

Anonymous Coward | about a year ago | (#44045955)

I would love to see the end of Intel embedded graphics.

Not only mobile (1)

NoNonAlphaCharsHere (2201864) | about a year ago | (#44046021)

I'm guessing the High Performance Computing guys might be interested as well.

Re:Not only mobile (1)

fuzzyfuzzyfungus (1223518) | about a year ago | (#44046129)

I'm guessing the High Performance Computing guys might be interested as well.

I'd imagine that it depends on how heavily current GPU/CPU compute systems lean on the 'CPU' side of the arrangement:

If the CPU actually keeps reasonably busy (either with aspects of the problem that aren't amenable to GPU work, or with assorted housekeeping tasks required to keep the GPUs fed and coordinated across the cluster), Intel or AMD offer pretty good prices for chips that provide a lot of PCIe lanes, support tons of RAM, and are supported by most of the world's horrid legacy software. Plus, motherboards and other supporting gear are brutally commodified, which is always nice.

If the CPU is mostly idle, and mostly gets included because it's the cheapest way to get a bunch of PCIe lanes and boot an OS that can run CUDA drivers and a NIC, then a Tesla-like card that includes a weedy little ARM core and can run on a simple backplane, without any PC server components, would seem like a logical thing to produce.

Re:Not only mobile (2)

Cassini2 (956052) | about a year ago | (#44046273)

For supercomputing-type workloads, ARM does not have a CPU fast enough to deliver the Ethernet, InfiniBand, SSD, and other communications traffic to keep a Tesla fed with data.

However, Nvidia's long-term strategy must be to sell low-power and high-power ARM chips with GPU accelerators. Within 2 to 3 years, Intel will have a Xeon product that merges the existing 12-core Xeon processors with the 60-core Xeon Phi accelerators. Similarly, AMD will be building equivalent APU units with their mixed x86, ARM and GPU technologies. To be even marginally useful, Nvidia needs something to compete.

Personally, I think AMD stands a decent chance of having the fastest APUs. I think attempting to maintain cache coherency between massive numbers of cores reduces the performance/watt advantage of the Xeon Phi. Also, if you are going to have heterogeneous cores where the CPUs cannot run standard x86 code (like the Xeon Phi), then why not go fully heterogeneous to maximize APU performance? Currently, AMD has the fastest merged processing units.

Re:Not only mobile (1)

dbIII (701233) | about a year ago | (#44046847)

It depends on the workload. Currently, for a wide range of problems, nothing can keep these things fed quickly enough for them to be able to finish the job before a normal CPU can. For other problems they finish an order of magnitude quicker than normal CPUs can. Memory usage is the main thing that separates the tasks that will or won't work on a GPU.
I know the above poster would be aware of this; I'm just trying to simplify it for everyone else.

AMD (5, Interesting)

Guppy (12314) | about a year ago | (#44046053)

If you're wondering about AMD, they also had a project doing graphics for ARM CPUs, but it was outright sold off to Qualcomm.

Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.

Re:AMD (1)

socceroos (1374367) | about a year ago | (#44046199)

Huh, nice catch there, Sparky.

Aren't AMD getting back into the ARM+GPU game themselves now?

Re:AMD (1)

drinkypoo (153816) | about a year ago | (#44048711)

Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.

That explains why the drivers blow so hard. With an assortment of tweaks [xda-developers.com] you can increase Adreno 205 performance by literally 50%.

Re:AMD (0)

Anonymous Coward | about a year ago | (#44049165)

Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.

That explains why the drivers blow so hard. With an assortment of tweaks [xda-developers.com] you can increase Adreno 205 performance by literally 50%.

Congrats, you've just discovered an ATI (well, AMD) product. For a PC, their cards are built rock solid (and hot), but driver tweaks over the following 6-12 months will slowly increase the performance (and allow MOAR POWERZ or less heat). NVidia? Updates their drivers ... sometimes.

Re:AMD (1)

drinkypoo (153816) | about a year ago | (#44049521)

Congrats, you've just discovered an ATI (well, AMD) product. For a PC, their cards are built rock solid (and hot), but driver tweaks over the following 6-12 months will slowly increase the performance (and allow MOAR POWERZ or less heat). NVidia? Updates their drivers ... sometimes.

The problem is, this is a mobile GPU licensed out. For them it was fire-and-forget. And there never were any driver tweaks forthcoming from the OEM, only from "the community". Hooray for XDA-developers.

Intel Graphics (1)

Anonymous Coward | about a year ago | (#44046065)

Maybe now Intel can license the tech and *finally* get a decent GPU in its chips.

So Intel is getting Nvidia GPU technology (5, Interesting)

Anonymous Coward | about a year ago | (#44046123)

The ONLY company on this planet with an interest in very high-end desktop class GPU technology for their own use is Intel. No-one else has the need (PowerVR fills the gap for most companies that license GPU designs) or the ability to build such a complex design into their own SoC.

Anyone else with an interest in Nvidia GPU capabilities would opt to buy discrete chips from Nvidia, or one of Nvidia's existing ARM SoC parts.

AMD is currently devastating Nvidia in the high end gaming market. Every one of the 3 new consoles uses AMD/ATI tech for the graphics. EA (the massive games developer) has announced their own games engines will be optimised ONLY on AMD CPU and GPUs (on Xbone, PS4 and PC). Nvidia is falling out of the game.

The x86 space is moving to APUs only. Chips that combine the CPU cluster with the GPU system. Intel's integrated GPU is pure garbage. However, Intel spends more on the R+D for its crap GPU than Nvidia and AMD combined. It would be insanely cheaper for Intel to simply license Nvidia's current and future designs. Doing so would give Intel parts that compete with AMD for the first time ever. Of course, it still wouldn't fix the problem that AMD tech is in the only hardware AAA games developers care about.

Next year AMD completes its project to take desktop x86 parts to full HSA and Huma (as seen in the PS4). Next year Intel begins the process to use this tech (and will be two years behind AMD at best). Both companies are moving to PC motherboards that solder memory and CPU on the board itself. Both are moving to a 256-bit memory interface, although again AMD will have a significant lead here.

Intel wants to copy AMD's GDDR5 memory interface (again, as seen in the PS4) but that requires a lot of tech Intel does not have, and cannot develop in-house (god only knows, they've tried). Nvidia also has massive expertise with GDDR5 memory interfaces, and the on-chip systems to exploit the incredible bandwidth this memory offers.

Everyone should know Intel wanted to buy Nvidia, but would not accept Nvidia's demand to have their people run the combined company. The top of Intel is BRAINDEAD, composed of the useless morons who claimed credit for the 'Core' CPU design, when all Core really was was a return to the Pentium 3 after NetBurst proved to be a horrible dead end. This political power grab is responsible for all Intel's current problems, including the biggest disaster in semiconductor history - Larrabee. Intel's FinFET project has crashed twice (Ivy Bridge was much worse than Sandy Bridge, despite the shrink, and Haswell is worse again). Intel has no new desktop chips for 2014 as a consequence.

Now we can see it is likely Intel is readying Nvidia-based parts for 2015 at the earliest. Intel has used licensed GPU tech before, notably the PowerVR architecture. However, Intel's utter inability to write or support drivers meant the PowerVR-based chips were a disaster for Intel. Intel's biggest problem with its current GPU design is NOT that it is a Larrabee-scale failure, but that Intel is actually making headway. So why is this an issue?

Well companies like S3 also made successful headway with their own designs, but this didn't matter because they were way behind the competition at the time. It is NEVER a case of being better than you were before, but a question of being good enough to go up against the market leaders. Intel knows its progress means that internally its GPU team is being patted on the back and given more support, and yet this is a road to nowhere. Intel needs to bite the bullet, give up on its failed GPU projects, and buy in the best designs the market has to offer. Nvidia is this.

Unlike PowerVR, which is largely a take it or leave it design (which is why Intel got nowhere with PowerVR), Nvidia comes with software experts (for the Windows drivers) and chip making experts, to help integrate the Nvidia design with Intel's own CPU cores.

Re:So Intel is getting Nvidia GPU technology (1)

Nutria (679911) | about a year ago | (#44046229)

all Intel's current problems

With US$18Bn in cash and other marketable securities, sales of US$53.3Bn and net income of $11.0Bn, I'll take Intel's problems any day.

Re:So Intel is getting Nvidia GPU technology (5, Interesting)

rahvin112 (446269) | about a year ago | (#44046389)

Intel isn't going to buy or license nVidia stuff. They already have a license to use all their patents through a cross-license deal that excluded a large chunk of Intel patents and IP. Intel is 100% focused on power consumption at this point, and nVidia tech would do nothing but hurt them on this front. Haswell includes a GPU that's almost as good as the nVidia 650 and uses less power than Ivy Bridge. It's also cheaper for the OEM/ODMs and provides better total power use.

It's trivially easy for Intel to just keep advancing the GPU with each processor generation. As people have been saying for years, nVidia's biggest problem is that as Intel keeps raising the low end with integrated processors that don't suck, they erode significant revenue from nVidia. The reason prices for top-end nVidia parts keep going up is that they are continuing to lose margin on the middle end and have lost the low end. Better than half the computers sold no longer even include a discrete GPU. As Intel continues its slow advance they will continue to eat more and more of the discrete marketplace. Considering the newest consoles are going to be only marginally better than the current consoles, we're probably looking at another 7 years of gaming stagnation, which in the long run will damage nVidia more as fewer games require more resources than integrated GPUs. I seriously doubt nVidia can go much higher than the current $1100 Titan and expect to sell anything at all. I expect nVidia to see consecutive quarterly declines in revenue over the next two years. They've already eroded margin and they can't push price much higher.

They bet their lunch on HPC, and didn't even come close to their projections on sales. Then they bet the farm on Tegra: they sold none of Tegra 1, had just short of no sales on Tegra 2, did OK but only with tablets for Tegra 3, and have announced not a single win for Tegra 4. Project Denver was supposed to be the long-term break with Intel that would provide the company the opportunity to move forward as a total-service SoC company. Denver is supposed to be a custom-designed 64-bit ARM processor with an integrated nVidia GPU. It was projected for the end of 2012. After missing 2012 they claimed end of 2013; this announcement makes me personally believe Project Denver has been canceled. Things haven't looked good for nVidia ever since Intel integrated GPUs and blocked them from the chipset market. They won't be selling to Intel because Intel doesn't want them. The other SoC vendors appear to be satisfied with PowerVR products (which focus on power use), except for Qualcomm, which has the old AMD mobile cores to work with. I can't help but believe that this is, as others have said, an attempt to go total IP and try to litigate a profit. This is probably the beginning of a long slow slide into oblivion. nVidia's CEO has already sold most of his holdings (except for unexercised options, also a very bad sign).

Re:So Intel is getting Nvidia GPU technology (1)

adolf (21054) | about a year ago | (#44047425)

Better than half the computers sold no longer even include a discrete GPU

Has it ever been the case in the past decade that more than half of the computers sold included a discrete GPU?

Once integrated graphics became a usable thing, the vast majority of systems that I see* do not have a dedicated graphics card: integrated graphics of the day have always been adequate for any non-gaming usage of that same day, and people are (as a rule) cheap.

*: This is an anecdote based on a couple of decades of fixing computers. I welcome actual data.

Re:So Intel is getting Nvidia GPU technology (0)

Anonymous Coward | about a year ago | (#44047635)

I interpreted that as a discrete GPU chip (whether integrated on the mobo or on an external card), since that is the only way it makes sense to me. In the past, mobos would often have a crappy GPU on the mobo; nowadays the GPU is often integrated with the CPU, so the GPU is no longer a discrete processor.

Re:So Intel is getting Nvidia GPU technology (1)

lightknight (213164) | about a year ago | (#44048135)

Good luck with that -> the higher-end market will be keeping its discrete CPUs / GPUs, if only because it makes dual/quad-CPU/GPU designs easier. Or the fabled 'wall of processors' available in supercomputers.

On a side note, if AMD is willing to introduce multi-CPU capability into its high-end FX consumer line, I'd work to spec it into the machines I build. Dual latest edition FX CPUs...with 32 cores total or whatever...works for me.

Re:So Intel is getting Nvidia GPU technology (0)

Anonymous Coward | about a year ago | (#44046403)

You may be overestimating the consoles. Nvidia walked out of the Xbox 360 talks, if I remember correctly, because they didn't want the contract for what was offered. This probably has something to do with the 10-year span in which you have to offer support for a certain type of GPU. It's a long time, and it's possible that somewhere along the line you start to earn less and less on it.
PCs and mobile platforms move faster; support for GPUs there really isn't needed that long in most cases. And Nvidia had a lot of success in the smartphone market.

Re:So Intel is getting Nvidia GPU technology (1)

symbolset (646467) | about a year ago | (#44046627)

Consoles are priced low and top-tier console makers are looking at tens of millions of units at least, so they fight for every millicent and they have leverage. It probably didn't math out for nVidia or Intel to provide the CPUs (or, for nVidia, the GPUs) for this generation. The result is that we get a console generation that's a midrange PC with an off-the-shelf AMD GPU. That means quick porting of games between consoles and PC and, for the most part, an end to exclusivity of titles. For the consumer that's a win. For the console makers it's a club to bludgeon each other with. And that's a good thing too.

Sooner or later both consoles will be cracked, but at this price point that's unlikely to yield the kind of savings that cracking the PS3 did. That was remarkable tech on launch day.

Re:So Intel is getting Nvidia GPU technology (1)

Xest (935314) | about a year ago | (#44049903)

Yep, exactly. nVidia has rarely had much to do with the console market, so what the GP says has really nothing to do with AMD "trouncing" it and everything to do with nVidia avoiding unprofitable markets.

Nintendo has pretty much always been ATI, IIRC, and only the PS3 and the original Xbox have used nVidia. Certainly the GameCube, the Wii, the Wii U, and the Xbox 360 all used ATI. The PS2 used some home-grown Sony thing.

Reading anything into the fact that the new consoles are using AMD GPUs tells us nothing much, given that AMD/ATI have held the majority of the console market for over a decade now, yet it still hasn't stopped their profits free-falling deep into the negative whilst nVidia has continued to retain healthy profit growth. It's certainly not done ATI/AMD any good being in that market, so why would nVidia care?

Re:So Intel is getting Nvidia GPU technology (0)

Anonymous Coward | about a year ago | (#44046891)

No next-gen console uses a "high-end" GPU, so AMD is selling a lot of mid-grade stuff, and good for them - they need the money.

In the discrete GPU PC scene, Nvidia still outsells AMD at about 2:1. Games will be optimized only for AMD GPUs? Give me a break. If "next-gen" AMD GPUs are DX-compliant (anyone care to guess if they will be or not?), Nvidia will just follow suit and close the gap either by raw performance or with their TWIMTBP team. Nvidia has been and is still the enthusiast discrete GPU brand among PC gamers due to driver support, frame latency, tech advancements (adaptive vsync, FXAA, OGSSAA, etc.), raw performance, etc.; if Nvidia is to fall into gaming obscurity, they'll need to fuck this up for themselves.

Re:So Intel is getting Nvidia GPU technology (1)

Anonymous Coward | about a year ago | (#44047107)

AMD is currently devastating Nvidia in the high end gaming market

In what universe? AMD are currently anathema due to their STILL failing to fix their massive frame timing issues in multi-GPU setups, and significantly lower efficiency (performance/watt) for smaller or laptop machines.

And inclusion in consoles is not such a big 'win' as you might think. Console margins are razor thin. It may be a constant source of profit for the next few years, but that profit is not huge and is reliant on continuing to improve the processes the chips are built on (to follow the pressure from console makers to reduce costs). Nvidia got out of that market because they've got the GPGPU market almost entirely to themselves. They don't need to sell low-margin glorified laptop GPUs to console makers for a pittance.

AMD really need to shape up. Without competitive pressure, there's not much to keep pushing Nvidia forwards once they reach 'good enough' GPGPU performance. This has pretty much happened with AMD and Intel; the last few Core generations have had barely incremental performance increases, but brought the power consumption down to the level of AMD's lower-end CPUs. You now have the choice of a cheap AMD CPU with crap performance, or a slightly more expensive Intel CPU with massively improved performance, with both having the same battery life.

This is great for Intel, but bad news for consumers. Similarly, unless AMD get their GPU house in order (and actually release a new architecture, rather than the currently scheduled rebadge of their existing one), it'll be good news for a complacent Nvidia and bad news for consumers.

Re:So Intel is getting Nvidia GPU technology (2)

Kjella (173770) | about a year ago | (#44047129)

Well companies like S3 also made successful headway with their own designs, but this didn't matter because they were way behind the competition at the time. It is NEVER a case of being better than you were before, but a question of being good enough to go up against the market leaders. Intel knows its progress means that internally its GPU team is being patted on the back and given more support, and yet this is a road to nowhere. Intel needs to bite the bullet, give up on its failed GPU projects, and buy in the best designs the market has to offer. Nvidia is this.

The Steam hardware survey [steampowered.com] seems to disagree: 14% of gamers are now happy running Intel chips, so how many non-gamers do you think find them good enough? A GPU running as part of a CPU with a <100W total power budget is never going to compete with dual SLI/CF 200W+ discrete chips; both Intel and hardcore gamers know that. Intel just wants to be in mainstream products without AMD/nVidia getting discrete chip sales, and they're succeeding: check any statistics for computers shipped with discrete graphics and they're in decline. Maybe it's an AMD APU, but most of the time it's an Intel.

Re:So Intel is getting Nvidia GPU technology (1)

Verunks (1000826) | about a year ago | (#44047503)

AMD is currently devastating Nvidia in the high end gaming market. Every one of the 3 new consoles uses AMD/ATI tech for the graphics. EA (the massive games developer) has announced their own games engines will be optimised ONLY on AMD CPU and GPUs (on Xbone, PS4 and PC). Nvidia is falling out of the game.

Where exactly are you getting the numbers for AMD devastating Nvidia? Both the Titan and the GTX 780 are better than AMD's GPUs, and the Steam hardware survey [steampowered.com] still shows 52% for Nvidia and 33% for AMD. The PS4 and the Xbox One will use AMD because a combined CPU and GPU is more convenient for a console, and Intel GPUs are not exactly good.

As for EA, do you understand that they got paid by AMD to do that? This is something both Nvidia and AMD have done for years; all those games with "better with Nvidia/AMD" intros didn't put them there because the developers like them. For example, one of the main guys behind Frostbite (which, as you just said, will only be optimized for AMD) just built a Titan system https://twitter.com/repi/status/346335279751237633 [twitter.com] . Why would he use Nvidia instead of AMD if AMD were devastating Nvidia?

Both AMD and Nvidia have strengths and weaknesses, but nobody is devastating anyone.

Also the console contract isn't great (2)

Sycraft-fu (314770) | about a year ago | (#44047625)

Consoles are focused on the lowest possible hardware cost, since they sell to consumers at a loss, or at best a slim profit. They need their suppliers to give them hardware for bottom dollar. That means you don't get much profit per unit.

Now, that doesn't mean AMD is getting screwed; I'm sure they are making money per unit sold. But make no mistake: the reason they got the contracts is that they could offer the lowest price, and that means a thin profit. So 10 million chips sold into consoles is less profit than 10 million sold into desktops or servers.

It is not the grand prize of hardware contracts.
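
To put rough numbers on that, here's a back-of-the-envelope sketch; the per-chip margins below are made-up figures for illustration only, not anything AMD or the console makers have disclosed:

    #include <stdio.h>

    int main(void) {
        /* Purely hypothetical per-chip margins, just to illustrate the point above */
        double console_margin = 10.0;  /* assumed profit per console APU, USD  */
        double desktop_margin = 50.0;  /* assumed profit per desktop GPU, USD  */
        double units = 10e6;           /* 10 million chips, as in the comment  */

        printf("Console contract profit: $%.0f million\n", console_margin * units / 1e6);
        printf("Desktop sales profit:    $%.0f million\n", desktop_margin * units / 1e6);
        return 0;
    }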

On another note, I find it hilarious how fanboys revel in the idea of a competitor doing badly, as if we wouldn't all be worse off with a single company. Personally, I like nVidia GPUs; they work better in my experience. However, I'm real, real glad AMD is around. Why? Because if they weren't, nVidia could, and would, charge more than they already do, and they wouldn't release new tech as fast.

So if you are an AMD fanboy wishing for the death of Intel and nVidia, what you are really saying is "Gee, I hope AMD will be able to overcharge me for lower-end technology when they have nobody pushing them!"

Re:So Intel is getting Nvidia GPU technology (1)

Waccoon (1186667) | about a year ago | (#44047683)

Sucks, because nVidia's drivers blow away AMD's. Radeon is a nice architecture, but after years of abuse I was completely fed up with stuff not working, and my new GTX is rock solid and runs everything I throw at it.

I'm talking about Windows drivers, of course, so little of this matters to the consoles and embedded developers.

Re:So Intel is getting Nvidia GPU technology (1)

drinkypoo (153816) | about a year ago | (#44048787)

Everyone should know Intel wanted to buy Nvidia, but would not accept Nvidia's demand that their people run the combined company. The top of Intel is BRAINDEAD, composed of the useless morons who claimed credit for the 'Core' CPU design, when all Core really was was a return to the Pentium 3 after NetBurst proved to be a horrible dead end.

They didn't just throw NetBurst away. Bits of it appeared in Core, alongside the Pentium 3 technology.

Unlike PowerVR, which is largely a take-it-or-leave-it design (which is why Intel got nowhere with PowerVR), Nvidia comes with software experts (for the Windows drivers) and chip-making experts to help integrate the Nvidia design with Intel's own CPU cores.

The difference is that PowerVR is crap and has always been crap. I owned the Riva TNT, I owned the original PowerVR board, I owned the original 3dfx... I've owned examples of all of them (plus Radeons) since, and PowerVR is the biggest failure; their drivers are even worse than AMD's.

Re:So Intel is getting Nvidia GPU technology (1)

Xest (935314) | about a year ago | (#44050075)

You seem to be talking up AMD a lot and talking down Intel and nVidia.

Given your points about Intel's supposedly major management failures and AMD devastating nVidia in the gaming market, could you explain how Intel's $11bn of profit and nVidia's $0.5bn of profit factor into the equation against AMD's -$1bn? Yes, AMD lost twice what nVidia made last year, and made $12bn less than Intel.
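
Just to spell that arithmetic out, here's a quick sanity check of the figures quoted above, nothing more:

    #include <stdio.h>

    int main(void) {
        /* Annual profit figures as quoted above, in billions of USD */
        double intel = 11.0, nvidia = 0.5, amd = -1.0;

        printf("AMD's loss vs nVidia's profit: %.1fx\n", -amd / nvidia); /* prints 2.0x  */
        printf("Profit gap, Intel minus AMD:   $%.0fbn\n", intel - amd); /* prints $12bn */
        return 0;
    }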

Something about your argument doesn't stack up. If AMD were doing so well, Intel were so badly run, and nVidia were getting devastated, one might wonder why AMD's profits have been in the red for a long time now, while nVidia's have continued to grow and Intel is raking in more profit than Google.

Similarly, there's a reasonable question as to why both Intel and nVidia hold so much more market share than AMD, and why both Intel's and nVidia's products are generally seen as higher quality than AMD's.

The reality is that there aren't many metrics by which AMD is doing well: they're not doing well financially, consumer confidence in their products is lower, and the only large contracts they do win are the low/zero-profit ones (like consoles). It would be nice if they were as healthy as you say, because it's good to retain some plurality in any market to keep competition healthy, but sadly that's just not the case right now.

Drivers (0)

Anonymous Coward | about a year ago | (#44046155)

What does this mean for Nvidia drivers? Maybe slightly more documentation (or even source), or just more complex NDAs?

Re:Drivers (1)

symbolset (646467) | about a year ago | (#44046629)

Nothing. nVidia can't admit that Microsoft (or some Microsoft puppet) owns the source code to their Windows driver and patents on some of their graphics tech.

In other words... (0)

Anonymous Coward | about a year ago | (#44046167)

"PowerVR is kicking our ass in the mobile market and even Intel is doing a better job of transitioning their graphics tech in that direction. We also intentionally lost a lot of business by declining to license our GPUs for next-gen game consoles. We need all the help we can get to increase the rate of adoption of our GPU technology in mobile devices."

Re:In other words... (1)

symbolset (646467) | about a year ago | (#44046693)

Imagination Technologies, owners of PowerVR, became members of the Open Handset Alliance [imgtec.com] four months ago. The Open Handset Alliance is the Android booster org. Before this they were a strictly proprietary-driver company, and Android devices that used their tech shipped binary blobs. The binary blobs aren't gone yet, but they will soon be replaced with open-source-licensed drivers and actual hardware specifications.

So Microsoft needs there to be a mobile GPU company with secret drivers, so it can sell its mobile software on platforms that can't be made useful with a software flash. They cast about and set their sights on nVidia, who had already signed the devil's deal to keep how their PC hardware works a trade secret. They probably promised nVidia something useless in return; that's their usual course. Now Microsoft's puppet hardware ODMs will build Microsoft nVidia-GPU-based tablet platforms that can't run Android, won't sell, and will have to be dumped everywhere like Surface RT is now. Expect Surface RT 2, whatever it ends up being named, to use this tech. In the end nVidia gets hosed, again. If you sup with the devil, use a long spoon.

Intel used Imagination Tech in their Atom line as well, and that's why you can't get good Linux drivers for those otherwise sweet mini-ITX boards. Yet. They're coming. Intel has since dropped them, though, in favor of in-house graphics.

It's really hard to track the machinations in GPUs.

Moore's Law is killing Wintel (0)

Anonymous Coward | about a year ago | (#44046195)

Then why did Nvidia go into Tegra?

I thought Nvidia went into Tegra precisely so it could get into the SoC market without licensing its IP out. Tegra must be losing money.

What is not being discussed is that Moore's Law has changed the PC market over the last decade. Faster and cheaper transistors mean the margins for error in software and hardware design are bigger now. A less skilled microprocessor design team, and less up-to-date processes, are now good enough. It also means higher-level programming languages deliver good enough performance without dropping to assembly, which makes it easy to switch instruction sets. That is very bad for Intel's profit margins. With stronger hardware, higher-level languages also make it easier to switch operating systems and desktop environments. Bad for Microsoft.
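
As a small illustration of that instruction-set point (a minimal sketch; the MIPS cross-compiler name below is just an assumption about which toolchain happens to be installed), the same C source builds unchanged for x86, ARM or MIPS by swapping the compiler target, whereas hand-written assembly has to be redone for each ISA:

    /* portable.c: the same source builds for any ISA; only the compiler
     * invocation changes, e.g. (assuming the cross toolchain exists):
     *   gcc -O2 portable.c -o portable
     *   mips-linux-gnu-gcc -O2 portable.c -o portable
     */
    #include <stdio.h>

    int main(void) {
        long sum = 0;
        for (long i = 1; i <= 1000; i++)  /* the compiler lowers this loop to whichever ISA */
            sum += i;
        printf("sum = %ld\n", sum);       /* 500500 */
        return 0;
    }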

Now, it is possible to use MIPS and Android to make a decent computing stack. Now, if some organization with lots of money, like say, CHINA, decided it wanted to kick Microsoft and Intel to the side, it could now do so.

Re:Moore's Law is killing Wintel (1)

Nutria (679911) | about a year ago | (#44046245)

A less skilled microprocessor design team, and less up-to-date processes, are now good enough. ... That is very bad for Intel's profit margins.

Then why is AMD losing money hand over fist while Intel is raking in the cash?

Re:Moore's Law is killing Wintel (2)

symbolset (646467) | about a year ago | (#44046727)

Server side is where the margins are. AMD borked a server CPU generation and wound up cleaning house. In a different world that would not be a recoverable error. Since Intel needs AMD around to blunt monopoly scrutiny, Intel's server tech will probably be delayed to give AMD a chance to catch up a little. Intel will keep inventing clever new stuff, but stuff it in a closet again as they have done many times before. This isn't a big deal, since server tech is so far beyond what it needs to be that Intel could probably coast for six years before they had to start innovating again. Maybe they'll retask some engineers from servers (and God please, Itanic) to mobile. That would be nice.

Re:Moore's Law is killing Wintel (1)

MachineShedFred (621896) | about a year ago | (#44048541)

Intel server tech will probably be delayed to give AMD a chance to catch up a little bit.

Really?

Didn't Intel just kick new Xeon and Xeon Phi parts out the door, like, yesterday? [slashdot.org]

like arm does? (1)

strstr (539330) | about a year ago | (#44046209)

This sounds a lot like what ARM is doing right now. Nvidia will license the design of their GPU core, and perhaps some software, IP, and other tech, to other companies so they can build chips around the Nvidia architecture. The thing is, there is demand for high-end GPUs, but for including Nvidia's tech to be practical, licensees need more freedom to design and implement the hardware as they wish. This is a completely new market for Nvidia and will bring higher-quality graphics and Nvidia IP to more devices. It's a move toward collaborative computing: designing and sharing ideas and technology that will ultimately make the world a better, faster, higher-quality, and more efficient place. Plus, Nvidia otherwise has a stranglehold on quality graphics, so it's their duty to help make high-quality graphics mainstream and easy to implement. No more need to reinvent the wheel and start from scratch every time I want to implement graphics, or to fight my inability to design something worth using. Maybe we even move away from 2D and subpar graphics for good now.

http://www.oregonstatehospital.net/ [oregonstatehospital.net]

WHERE IS THE F'ing CODE! (0)

Anonymous Coward | about a year ago | (#44046247)

I'm sick of hearing about NVIDIA when they won't release the code for anything....

Totally utterly worthless...

I don't care how well you support the product you sell when it's guaranteed to be dropped at some later date, and then NOTHING works.


Nvidia and AMD (0)

Anonymous Coward | about a year ago | (#44047505)

I think this reeks of desperation. Nvidia has dominated the GPU market for a long time and currently, IMO, still has a somewhat better portfolio when it comes to discrete GPUs. The difference is very small, though, so you could argue whether it even matters.

What I find really interesting is that AMD seems to be on the winning track now. They have been the butt monkey of the industry for years.
AMD could never get the upper hand over Intel, except for a few short (but very important) years when Intel screwed up with the P4. ATi could never really get the upper hand over Nvidia either, as the discrete GPU market share numbers past and present show (30-ish% for AMD, the rest for Nvidia); and in the low-end market you have a lot of Intel as well.

AMD took over ATi, nothing really changed, and many people even thought they might sink.

But then something happened: miniaturisation made it possible to integrate GPU and CPU, it made integrated GPUs worth something (in contrast to what Intel had attempted in previous years), and AMD, thanks to ATi, actually had decent GPUs to integrate.
For the mass market you need to be as cheap and simple as possible, and suddenly AMD can offer that, while Nvidia (no CPU) and Intel (no decent GPU) cannot.
It seems AMD really picked the right track. This move from Nvidia seems to indicate they are trying to reinvent themselves for the future but have no real idea how. If things continue as they are, AMD has about five years until Intel has a chance to catch up, and Nvidia needs even longer (they would have to develop x86 experience AND experience integrating it).
AMD's advantage in integrating CPU and GPU grows with every new chip generation, since not much innovation is required on the CPU side while the GPU part keeps getting better. I think both big next-gen consoles picking AMD is not only proof of that, but merely the beginning of a development that might end with Nvidia being sold for scrap (to Intel?) by the end of the decade.

In all honesty: Who would have thought that like five years back?

Re:Nvidia and AMD (0)

Anonymous Coward | about a year ago | (#44048345)

NVIDIA needs to be patient. AMD will screw it up like they always do. Look at the FX series... it looked good on paper, and when it launched it was a power hog that couldn't keep up with the previous generation's high-end chips on some benchmarks. I want to like AMD, and I do buy a lot of their CPUs, but the lack of OS support for their GPUs and APUs keeps me from taking them seriously going forward. I also find their roadmap scary. They have no high-end chips planned anytime soon. What happens when they get beaten in the mid-range with no new pipeline to fall back on?

Shooting oneself in the foot (1)

sociocapitalist (2471722) | about a year ago | (#44048577)

...and one week later they'll find themselves competing against a hundred Chinese brands that use exactly the same designs.

Wait, what? (1)

drinkypoo (153816) | about a year ago | (#44048697)

Shannon points out that NVIDIA did something similar with the CPU core used in the PlayStation 3, which was licensed to Sony

Really? NVIDIA licensed an AMD CPU core to Sony? Nifty.

I can't afford a Kepler compute card (1)

Richard Rankin (2956843) | about a year ago | (#44048909)

They've been selling only to Cray and the Chinese, so their prices are through the roof. I could be building departmental analytics machines, but apparently going for volume is a scary proposition.

Does this mean Tegra 5 is dead? (1)

edxwelch (600979) | about a year ago | (#44049577)

Think about it: if they license the Kepler patents to a third-party SoC developer, that company will be competing directly against Nvidia's own Tegra 5 chip. So the only way this makes sense is if they are canceling the Tegra 5 project.
