
NVIDIA Responds To Intel Suit

samzenpus posted more than 5 years ago | from the chip-wars dept.

The Courts

MojoKid writes "NVIDIA and Intel have always had an interesting relationship, consisting of a dash of mutual respect and a whole lot of under-the-collar disdain. With situations such as this one, it's easy to understand why. NVIDIA today came forward with a response to a recent Intel court filing in which Intel alleges that the four-year-old chipset license agreement the companies signed does not extend to Intel's future-generation CPUs with 'integrated' memory controllers, such as Nehalem. NVIDIA CEO Jen-Hsun Huang had this to say about the whole ordeal: 'We are confident that our license, as negotiated, applies. At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.'"


Decaying CPU business? (5, Interesting)

Libertarian001 (453712) | more than 5 years ago | (#26913237)

WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?

Re:Decaying CPU business? (5, Interesting)

0123456 (636235) | more than 5 years ago | (#26913299)

WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?

Doesn't Intel sell more GPUs (admittedly crappy integrated ones) than Nvidia does?

I think they mean "decaying" margins (5, Interesting)

Anonymous Coward | more than 5 years ago | (#26913489)

By locking all competitors out of the chipset business, a company can boost margins (and thus boost profit), as opposed to living with decaying margins and lower profitability due to commoditization.

As standalone CPUs get commoditized, the margins and profitability decay.

Also if you sell crappy integrated GPUs, you can protect the GPUs from competition and the CPUs from commoditization by bundling them and locking out competitors.

Intel didn't get to where they are today by not knowing how to play the game. They wouldn't be walking away from their standalone CPU business and moving to integrated CPU/GPU if they didn't think their old standalone CPU business would suffer from decaying margins. As they move into this space, it also only makes sense to try to put up barriers to competitors who might be trying to screw up your future business strategy. Remember how Intel made AMD go off and execute "Slot A" when before that they made pin-compatible chips. This seems like a very similar strategy to try to kick Nvidia out of the Intel ecosystem.

Re:I think they mean "decaying" margins (1)

GigaplexNZ (1233886) | more than 5 years ago | (#26913775)

They wouldn't be walking away from their standalone CPU business and moving to integrated CPU/GPU if they didn't think their old standalone CPU business would suffer from decaying margins.

You seem to be asserting that they would only change business plans if the current plans are losing ground. This is not true. Companies are always looking for ways to make more money and could simply look for something with more potential even if their current approach is still going strong.

Re:I think they mean "decaying" margins (1)

scientus (1357317) | more than 5 years ago | (#26913793)

They don't have to think that CPUs will lose margins to want to go into GPUs. It could be just because there is a profit to be made by doing what they are doing, or maybe they don't even intend to make a profit off their graphics; they just want to kick NVIDIA by reducing their volumes.

The other way around too (5, Interesting)

DrYak (748999) | more than 5 years ago | (#26914787)

While Intel is trying to lock nVidia and ATI/AMD out of the chipset business by bundling the CPU and the chipset and bridging them with an interconnect - QuickPath - which they won't license to nVidia,
nVidia, for their part, has tried to do exactly the same, locking Intel and ATI/AMD out of the chipset business by bundling their chipsets with the GPU and bridging them with a technology that they won't sub-license either: nVidia's SLI.

nVidia has tried to be the only chipset in town able to do SLI.
Intel is currently trying to be the only chipset in town usable with the Core i7.

Meanwhile, I'm quite happy with ATI/AMD, who use an open standard* that doesn't require licensing between the CPU and the chipset (HyperTransport) and another industry standard for multiple GPUs requiring no special licensing (plain PCIe).

Thus any component in an Athlon/Phenom + 7x0 chipset + Radeon HD stack could be replaced with any other compatible component (although currently there aren't that many HT-powered CPUs to pick from).

*: The plain, simple, normal HyperTransport is open. AMD has made proprietary extensions for cache coherency in multi-socketed servers. But regular CPUs should work with plain HyperTransport too.

Re:Decaying CPU business? (5, Insightful)

Jthon (595383) | more than 5 years ago | (#26913505)

Define sell. If you mean bundled virtually for free with CPUs (or in some cases cheaper than just a CPU; go monopoly), then yes, they do.

If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

In fact they count sales of chipsets with integrated graphics as graphics sales for market share purposes, even if that computer also has a discrete graphics card. So if you buy something with an NVIDIA or ATI card and a 945G chipset, that counts as a graphics sale for Intel even though the integrated graphics chip is never used.

Their integrated graphics actually benchmarks slower than Microsoft's software DirectX 10 implementation (running on a Core i7). If people were more aware of just how poor Intel's integrated chips are, they'd probably sell even fewer.

Sadly, most people aren't aware of the vast difference in performance, and just assume their computer is slow when Aero, The Sims, Spore or Google Earth run poorly.

Until Intel ships Larrabee we won't really know if they can ship a GPU, and that looks to be still over a year away.

Re:Decaying CPU business? (5, Insightful)

_avs_007 (459738) | more than 5 years ago | (#26913577)

Not everybody particularly cares about 3D graphics performance. If you ask the common joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than play 3D games.

That being said, Intel integrated graphics tends to do quite well with video rendering, especially HD video.

Somebody who cares about 3D graphics performance because they want to play the latest and greatest games is going to buy discrete graphics regardless; it doesn't matter whether the integrated graphics is made by nVidia, ATI, etc.

Re:Decaying CPU business? (1)

TheTurtlesMoves (1442727) | more than 5 years ago | (#26913679)

Not everybody particularly cares about 3D graphics performance. If you ask the common joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than play 3D games.

Yeah, and then they ask their more clued-up friends why a game they just got runs like crap. Just because they don't care does not mean they don't use it.

By the way, both desktop machines at my place have unused integrated gfx.

Re:Decaying CPU business? (1)

Klintus Fang (988910) | more than 5 years ago | (#26913735)

As I mentioned in another reply elsewhere, though, it isn't really about the "average Joe". It's about business customers who buy PCs to put under their employees' desks. THAT is where the vast majority of desktop PC sales actually occur, and that is the reason Intel is the leader. Businesses do not need or even want their employees using office PCs that are capable of playing games well. They want their employees using office apps and running email programs.

Re:Decaying CPU business? (3, Interesting)

Jthon (595383) | more than 5 years ago | (#26914131)

It's not just about games; there are business uses for GPU acceleration. Presentation software could use the GPU to be more dynamic and render complicated graphs more smoothly. Some complicated PowerPoint presentations get slow; why not use a GPU to accelerate them?

Perhaps Excel or Matlab could use a GPU to crunch numbers to speed up calculations, or even use the GPU to make the charts more interactive.
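To make that concrete, here is a minimal CUDA sketch (a purely hypothetical kernel and numbers, not any real Excel or Matlab feature) of the kind of bulk recalculation that maps well onto a GPU: one thread per cell, all applying the same formula.

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical "recalculate a column" kernel: each thread updates one cell,
// e.g. projecting a value forward by a growth rate. Bulk, identical-per-cell
// math like this is what a GPU chews through in parallel.
__global__ void recalc(const float* in, float* out, float rate, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * (1.0f + rate);
}

int main() {
    const int n = 1 << 20;                      // a million "cells"
    size_t bytes = n * sizeof(float);
    float *h_in = new float[n], *h_out = new float[n];
    for (int i = 0; i < n; ++i) h_in[i] = 100.0f + i;

    float *d_in, *d_out;
    cudaMalloc((void**)&d_in, bytes);
    cudaMalloc((void**)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    recalc<<<(n + 255) / 256, 256>>>(d_in, d_out, 0.07f, n);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("cell 0 recalculated: %.2f\n", h_out[0]);

    cudaFree(d_in); cudaFree(d_out);
    delete[] h_in; delete[] h_out;
    return 0;
}

Every cell is independent of the others, so the work spreads across however many shader cores the GPU happens to have.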

Perhaps MS will overhaul their display system so it can use the GPU to render Word documents with better anti-aliasing and let large documents scroll faster. Adobe Acrobat actually supports some GPU acceleration (not on by default, I think) which makes PDFs render faster. I know turning on PDF acceleration actually makes me more productive, since I can read documents without having to wait for redraws.

Maybe we can do GPU-accelerated vector graphics for web site and UI rendering. Who knows what could be done to improve the business experience if the option is there.

NVIDIA expects to change the way people USE the GPU so it's NOT just for rendering 3D pictures anymore.

Some improvements to business experience might be small, but still give a small boost in productivity.

All that said, there will always be people who just use a very basic word processor. But these people also don't need Intel's next Core i7 quad mega CPU either. They'd be fine with their P2 running Windows 95 if the hardware didn't eventually break down.

The whole point is that NVIDIA wants to innovate on the GPU so that businesses and people can use it in new ways to do stuff they couldn't before. Intel wants to do the same, but requires you to buy a bigger CPU. Instead you could get a cheap integrated GPU and CPU combo and get the same productivity boost you used to get by buying a bigger CPU.

Re:Decaying CPU business? (4, Informative)

Jthon (595383) | more than 5 years ago | (#26913745)

Yeah, and then they ask their more clued-up friends why a game they just got runs like crap. Just because they don't care does not mean they don't use it.

That's exactly what I was getting at. I have friends who aren't die hard gamers who have no idea what a GPU is. But they still like to play games occasionally.

They go out and buy games like The Sims 2, Spore, or even World of Warcraft (yeah, casual people play this) and get frustrated that they run so poorly.

I hate to tell them that because they have a low-end Intel integrated chip they're just screwed (especially friends with laptops, where an upgrade is unheard of). Heck, even the lowest-end NVIDIA or ATI INTEGRATED chip is over 10 times faster than Intel's, and honestly costs only a couple of dollars more.

Sure, the NVIDIA/ATI integrated GPUs aren't top of the line, but at least with those the game is playable. I know someone who was trying to play some games on their Intel chipset, and textures and some other effects were just missing.

Re:Decaying CPU business? (1)

TheThiefMaster (992038) | more than 5 years ago | (#26914299)

At least the "Intel Integrated" desktop PCs normally CAN be upgraded with a dedicated graphics card.

You should see Via's approach: "What graphics slot?"
A PCI nVidia 5600 was actually an upgrade...

Re:Decaying CPU business? (1)

_avs_007 (459738) | more than 5 years ago | (#26913797)

Not everybody plays games. My wife owns zero games, and has played zero games on her PC since I met her. Same with my parents. Same with my wife's parents. If you ask her what her priorities are when it comes to her PC, 3D graphics/games rate very low.

Re:Decaying CPU business? (0)

Anonymous Coward | more than 5 years ago | (#26914117)

You can cite all the outliers you wish. The fact is that gamers and porn have done more to drive the development of computer hardware and the Internet than ANY other use.

Ain't it a shame.

Re:Decaying CPU business? (5, Interesting)

Jthon (595383) | more than 5 years ago | (#26914219)

There's more to GPU acceleration than gaming.

What does your wife do? Does she just send e-mail? Then beyond some UI improvements there's not much for her (but those UI improvements could be cool).

Does she encode music or videos for an iPod? That can be enhanced with the GPU. You can encode movies faster than real time on current GPUs, something you can't do with current CPUs.

Does she watch YouTube? I saw a demo of a program that runs some fancy filters on the GPU over low-quality, YouTube-like video and spits out something that looks pretty good. It was something that couldn't be done in real time on a CPU, but that a mid-to-low-range GPU could handle.
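For a sense of why that kind of filtering fits the GPU, here's a toy CUDA sketch (a made-up brightness/contrast pass, not the actual demo's filter): each thread cleans up exactly one pixel, so a whole frame is processed in one parallel sweep.

#include <cstdio>
#include <cuda_runtime.h>

// Toy per-pixel filter: every thread adjusts one pixel's contrast and
// brightness independently. Real video filters are fancier, but they have
// the same one-thread-per-pixel shape.
__global__ void enhance(const unsigned char* in, unsigned char* out,
                        int n, float contrast, float brightness) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = contrast * (in[i] - 128.0f) + 128.0f + brightness;
        out[i] = (unsigned char)fminf(fmaxf(v, 0.0f), 255.0f);  // clamp to 0..255
    }
}

int main() {
    const int n = 1280 * 720;                   // one 720p luma plane
    unsigned char* h = new unsigned char[n];
    for (int i = 0; i < n; ++i) h[i] = i % 256; // fake "video" data

    unsigned char *d_in, *d_out;
    cudaMalloc((void**)&d_in, n);
    cudaMalloc((void**)&d_out, n);
    cudaMemcpy(d_in, h, n, cudaMemcpyHostToDevice);

    enhance<<<(n + 255) / 256, 256>>>(d_in, d_out, n, 1.2f, 10.0f);

    cudaMemcpy(h, d_out, n, cudaMemcpyDeviceToHost);
    printf("first filtered pixel: %d\n", h[0]);

    cudaFree(d_in); cudaFree(d_out);
    delete[] h;
    return 0;
}

At 30 frames per second that's roughly 28 million pixels a second just for 720p, which is why even a low-end GPU keeps up where a CPU struggles.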

Does she do graphic design? GPU features in the new Photoshop allow the program to be much more responsive when editing images, and large filters complete in fractions of a second.

In the simplest cases a better GPU might increase UI responsiveness, and make the experience "smoother". But long term changes will likely change WHAT you do with the GPU.

NVIDIA at least is trying to change it so GPU acceleration isn't just about gaming. They want the GPU to be a massively parallel processor that your desktop uses when it needs more processing power.

Re:Decaying CPU business? (1)

Jthon (595383) | more than 5 years ago | (#26913709)

True, most people aren't looking to play Crysis, but simple programs like Spore and even Google Earth benefit from going to an NVIDIA or ATI integrated GPU. There's a visual quality upgrade on even these "casual" games, as the Intel chips do such a poor job.

Also both the ATI and NVIDIA integrated chipsets do a better job of decoding HD video with less CPU usage. (Check out reviews of the ATI 790G and NVIDIA 9400M chipsets if you don't believe me.)

Plus with OpenCL we might start to see more regular applications accelerated on the GPU. Photoshop is accelerated when working with images, and both NVIDIA and ATI have programs which use the GPU to transcode video that people can then put on their iPods or share with family over the 'net.

I'm sure Apple has some cool ideas on how to use the GPU since they're heavily investing in OpenCL for Snow Leopard. That's probably partly why they went all NVIDIA on their notebook line.

Re:Decaying CPU business? (1)

_avs_007 (459738) | more than 5 years ago | (#26914067)

The 9300 and G45 were pretty neck and neck with regard to HD video decoding, so I don't know about the 9400.

However, the 790G was actually significantly slower at decoding HD video than both the 9300 and G45, according to Tom's.

Re:Decaying CPU business? (1)

_avs_007 (459738) | more than 5 years ago | (#26914077)

Just to clarify: I meant I have no idea about the performance of the 9400. I did not mean I doubted its performance...

Re:Decaying CPU business? (1)

Jthon (595383) | more than 5 years ago | (#26914303)

The 9400 and 9300 are pretty much the same but I think the 9300 has 16 shader cores, and the 9400 has 32. I'm not sure if that affects HD decode much. It does make a decent amount of difference for gaming, and CUDA apps I think.

Re:Decaying CPU business? (1)

DMalic (1118167) | more than 5 years ago | (#26914277)

Wrong. There's a ton of people who want to play one of the Sims games, maybe a recent Civilization title, an MMO, or a reasonable RTS (StarCraft 2 is coming out and all). They have no need for something FAST, and the difference between Intel's graphics and everyone else's is often "it works decently, not great" vs "it won't even run."

Re:Decaying CPU business? (1)

EGenius007 (1125395) | more than 5 years ago | (#26913593)

I'm currently on hold for funding reasons, but I've intentionally sought out a motherboard with Intel integrated graphics [newegg.com] based on a review [phoronix.com] that suggests it would be suitable, and possibly the best low-budget option, for watching HDTV under MythTV.

Re:Decaying CPU business? (5, Informative)

Jthon (595383) | more than 5 years ago | (#26913645)

If you're looking for accelerated MPEG-4 and HD video playback, you won't find it on the Intel board. While they support XvMC fairly well, that only does MPEG-2.

Last month they released some drivers for VA-API, but that's in their closed-source binary blob driver, which works very poorly on Linux.

NVIDIA has VDPAU support, which already allows you to play back HD streams without having to fork over for a more expensive, hotter-running CPU.

Phoronix has several articles about this:

http://www.phoronix.com/scan.php?page=article&item=xorg_vdpau_vaapi&num=1 [phoronix.com]

http://www.phoronix.com/scan.php?page=article&item=nvidia_vdpau&num=1 [phoronix.com]

Re:Decaying CPU business? (1)

EGenius007 (1125395) | more than 5 years ago | (#26913713)

The problem is that I'm hoping to build a low-profile PC. The entry point for low-profile NVidia video cards is fairly high. At least among those I've been able to find.

I don't have anything against NVidia products--the video card in my current system is my second NVidia board so far--but I did take offense at the blanket statement that no one would ever have cause to look for a specific integrated graphics chipset.

Re:Decaying CPU business? (1)

Jthon (595383) | more than 5 years ago | (#26913765)

I didn't say integrated graphics was bad. I was saying INTEL's graphics are bad.

Go check out the 790G chipsets from AMD and the 9300/9400 chipsets from NVIDIA.

Both are integrated mainboards, but have much better 3D and HD decoding than what's offered by Intel, even in Linux. These will work for you in a low-profile home theater PC, and do a better job of it :).

Re:Decaying CPU business? (1)

twitchingbug (701187) | more than 5 years ago | (#26913855)

Dude. It's not a question of who's better at what for what (though generally I agree with you); it's just the fact that someone wanted to buy Intel integrated graphics, which directly refutes your original blanket statement. Which you chose to ignore in this post. That's all.

Re:Decaying CPU business? (1)

Jthon (595383) | more than 5 years ago | (#26913977)

My statement may have exaggerated a bit, but in general people seeking out Intel don't seem to be aware of NVIDIA or ATI's offerings. Both companies need to do a better job at marketing so people are aware that they have better integrated offerings than Intel.

About the only place where NVIDIA fails is in open-sourcing their drivers on Linux, but I haven't seen anyone cite this as their reason for choosing Intel yet. At least I can understand why someone would choose the more "open" platform even if its performance is worse.

I can't understand why someone would not choose the product which offers better battery life and more features for about the same cost on closed platforms such as Windows.

But then again the 9400/9300 are pretty new for NVIDIA (previously no integrated graphics), and on the AMD side the 790G is still fairly new. So maybe people just haven't heard about these products.

Re:Decaying CPU business? (1)

EGenius007 (1125395) | more than 5 years ago | (#26914029)

Thanks for the advice, I'll be sure to check those out.

Re:Decaying CPU business? (2, Informative)

walshy007 (906710) | more than 5 years ago | (#26914503)

Also available is the 9500GT in a low-profile form factor; I have one here in my media PC. That's about as high-end as you can get with that form factor, from what I've seen.

if you can wait, buy an ion (0)

Anonymous Coward | more than 5 years ago | (#26913785)

An Ion is about as small a standard form factor as you can get (pico-ITX).
Once you look at the Ion for an HDTV platform, I don't think you'd go back to looking at Intel's offering...

Re:if you can wait, buy an ion (1)

Jthon (595383) | more than 5 years ago | (#26913841)

An Ion is about as small a standard form factor as you can get (pico-ITX).
Once you look at the Ion for an HDTV platform, I don't think you'd go back to looking at Intel's offering...

That's the way to go in the future. The ION is the NVIDIA 9400M chipset (used by Apple in their laptops) paired with a low-wattage Intel Atom CPU. The entire thing is designed around a tiny pico-ITX board, draws very little power, and can be passively cooled.

But thanks to the NVIDIA ION chipset, the package can decode HD video and run Vista Premium (if you wanted to), something you can't do on Intel's stock platform of Atom + 945G for a chipset.

Re:Decaying CPU business? (1)

mrchaotica (681592) | more than 5 years ago | (#26914683)

The problem is that I'm hoping to build a low-profile PC. The entry point for low-profile NVidia video cards is fairly high. At least among those I've been able to find.

Really? I bought a 6200 a while back -- back when a 6-series would have been a reasonable thing to buy -- and it was low-profile except that it had a full-height metal backing plate attached to it (I can't remember if there was a half-height one in the box or not). It was even passively-cooled, too. And I wasn't even looking for a low-profile card; it just happened to be on sale or something.

I bet you could find a 9200 (or whatever the current low-end Nvidia card is) in a low-profile form-factor without even trying.

Re:Decaying CPU business? (5, Insightful)

Klintus Fang (988910) | more than 5 years ago | (#26913715)

It is not about bundling. It is about the fact that the vast majority of PC sales are to business customers who want to put desktops under the desks of their employees and don't give a damn about GPU performance. To those customers, spending the premium for an nVidia GPU is absurd. Hence, they buy inexpensive machines that have GPUs which suck at rendering 3D but are fully functional when it comes to running office or email applications. This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high-volume market, and seemed to think they needed to own ATI to do it.

Many of the people who put together high-end machines for gaming and/or other 3D applications (the people who buy and value what nVidia has to offer) frequently forget that the type of machine they love is a very tiny percentage of the desktop market...

Decaying Matrox business? (2, Interesting)

Ostracus (1354233) | more than 5 years ago | (#26913771)

"This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it."

I think you're partially right. If they had indeed only wanted entry into the business graphics market, Matrox would have been a better purchase. But ATI makes better GPUs, and they wanted entry there as well. It's easier to scale down a high-end GPU than it is to bring up a low-end one.

Re:Decaying Matrox business? (1)

Jthon (595383) | more than 5 years ago | (#26913813)

If all AMD wanted was the low-end business market they could have passed on ATI and just licensed a cheapo core from someone like SiS, or they could even have acquired S3.

They probably wanted some of the chipset design expertise on the ATI side to create a "Centrino"-like platform. That, and they thought the CPU would want to incorporate some of the parallel features of the GPU (Google their Fusion CPU project).

Re:Decaying CPU business? (0)

Anonymous Coward | more than 5 years ago | (#26914737)

A couple of years ago in my old company our standard purchase was a Dell GX270 with an nVidia MX 440 card. Why the graphics card? So the integrated graphics wouldn't steal 64MB of the 256MB of standard RAM. I doubt that's an issue these days.

Re:Decaying CPU business? (1)

CarpetShark (865376) | more than 5 years ago | (#26914941)

This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it.

Almost, but I think the real issue is that even traditional business desktops are beginning to need 3D, just for window compositing and "downloading..." animations. With Vista rating the entire computer based on the lowest score of a number of tests, and one of those tests being 3D performance, Intel was forced to up its game. Granted, Vista tanked, but probably not clearly before Intel made this decision (can't be bothered checking that, though). Presumably Windows 7 does the same, and certainly OS X and now Linux need 3D, too.

DX10? That Vista thing? (0)

symbolset (646467) | more than 5 years ago | (#26913803)

Wake me when Microsoft gots somethin that runs on stuff somebody wants. Even I know Vista is the suckage.

No, don't. I really could pass a purple twinkie about what Microsoft thinks is good stuff even if they buy adds in my mags [adcentercommunity.com] that say it's good enough. If you want to get on my stuff then wise up.

Thankfully, Intel is hearing me [intel.com] , yo. Otherwise I'd be waiting like until Jasmine textes me back, which is like for ev-er.

Re:DX10? That Vista thing? (1)

Jthon (595383) | more than 5 years ago | (#26913879)

You know, real-time ray tracing isn't something that has to be done on the CPU. There's no reason you can't write a ray tracer for a GPU.

In fact, NVIDIA already has a fully interactive ray tracer. They demoed it last summer at NVISION and SIGGRAPH '08. I'm sure as they expand CUDA support you'll see more, and faster, ray tracers.

Go check out http://developer.nvidia.com/object/nvision08-IRT.html [nvidia.com]
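To see how ray tracing maps onto a GPU at all, here's a deliberately tiny CUDA sketch (a made-up scene, nothing to do with NVIDIA's actual demo): one thread per pixel, each independently intersecting its ray with a single sphere.

#include <cstdio>
#include <cuda_runtime.h>

// Toy ray tracer: each thread owns one pixel, shoots a ray straight down -z
// from an orthographic image plane, and shades white where it hits a unit
// sphere at the origin. Rays never interact, so every pixel runs in parallel.
__global__ void trace(unsigned char* img, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Ray origin in [-1,1] x [-1,1] on the image plane.
    float ox = (x - w * 0.5f) / (w * 0.5f);
    float oy = (y - h * 0.5f) / (h * 0.5f);

    // Ray (ox, oy, 3) + t*(0, 0, -1) hits the unit sphere iff ox^2 + oy^2 <= 1.
    img[y * w + x] = (ox * ox + oy * oy <= 1.0f) ? 255 : 0;
}

int main() {
    const int w = 512, h = 512;
    unsigned char* d_img;
    cudaMalloc((void**)&d_img, w * h);

    dim3 block(16, 16);
    dim3 grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(d_img, w, h);

    unsigned char* img = new unsigned char[w * h];
    cudaMemcpy(img, d_img, w * h, cudaMemcpyDeviceToHost);
    printf("center pixel: %d, corner pixel: %d\n", img[(h / 2) * w + w / 2], img[0]);

    cudaFree(d_img);
    delete[] img;
    return 0;
}

A real ray tracer adds bounces, acceleration structures, and shading, but the per-pixel independence is exactly what the linked paper exploits to make it interactive on existing GPUs.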

Re:DX10? That Vista thing? (1)

symbolset (646467) | more than 5 years ago | (#26914013)

Dude. Even I know GPUs are optimised for compositing. Ray tracing is a way different thing. It has to have a way different system. Pretending it doesn't will not help you here.

Re:DX10? That Vista thing? (3, Insightful)

Jthon (595383) | more than 5 years ago | (#26914281)

Dude. Even I know GPUs are optimised for compositing. Ray tracing is a way different thing. It has to have a way different system. Pretending it doesn't will not help you here.

You didn't just write the above did you? You show your ignorance. A long time ago they did just compositing, but that was back in the VGA controller days.

Then they evolved to do fixed function rasterization, but those days are over (unless you're Intel doing integrated stuff).

GPUs are MUCH more programmable, and getting more so with each generation. You can do pretty much any floating-point math you want now. Go look up CUDA and OpenCL; they basically let you write C code for the GPU.

Sure, GPUs might not do so well when it comes to branching, but you'll see that GPUs are being used to do more than just rasterization. Sure, rasterization is an important target for NVIDIA/ATI, but that doesn't mean a GPU can only draw triangles.

If you look at the paper I linked (which you obviously didn't), it describes how they wrote a ray tracer using NVIDIA CUDA on EXISTING GPUs. If stuff gets more programmable, as NVIDIA seems to be targeting, it will only get easier to write ray tracers that run on the GPU.

If you want proof that GPUs do more than rasterization, go check out how NVIDIA's GPU tech is now in the Tsubame supercomputer.

Even Intel is getting into the GPU business with Larrabee; I bet they plan to write a ray tracer for that.

Re:Decaying CPU business? (4, Interesting)

GigaplexNZ (1233886) | more than 5 years ago | (#26913831)

If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

I actively seek out Intel graphics when looking at laptops due to the lower power requirements and better driver support (I hate it when NVIDIA and ATI drivers don't install in Windows, as I have to contact the OEM for an older version, and I've always had more issues with those brands on Linux). I know the performance is abysmal in comparison, but I don't care. If you don't want Intel graphics, that's fine, and I understand why, but that doesn't mean no one intentionally seeks them out.

Re:Decaying CPU business? (2, Interesting)

Jthon (595383) | more than 5 years ago | (#26913909)

NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

Apple is picky about battery life, and they recently switched to all NVIDIA on their laptop line, including the Macbook Air.

Don't just assume that because it's NVIDIA it's a power hungry monster. Sure the high end graphics cards need their own power substation, but they can do some nice low power stuff when they need to (9400M, Tegra).

Re:Decaying CPU business? (2, Interesting)

GigaplexNZ (1233886) | more than 5 years ago | (#26914239)

NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

Only for some models. My old 6600 Go (a very powerful laptop chip for its time) is still unsupported.

As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

I have, and they aren't particularly appealing. Their performance isn't sufficiently better that I can perform tasks I otherwise wouldn't be able to, so the gains are effectively worthless to me. The driver support issue isn't fixed by switching to NVIDIA/ATI integrated either (and is sometimes worse). Battery life is probably comparable, but it would need to be clearly superior for me to consider them.

Don't just assume that because it's NVIDIA it's a power hungry monster. Sure the high end graphics cards need their own power substation, but they can do some nice low power stuff when they need to (9400M, Tegra).

I don't assume that, but from what information I have gathered I feel the Intel chips are currently a better fit for my requirements.

Re:Decaying CPU business? (1)

Jthon (595383) | more than 5 years ago | (#26914359)

Only for some models. My old 6600 Go (a very powerful laptop chip for its time) is still unsupported.

That's annoying. I guess it looks like it's mostly just newer stuff up there so far, and not even all their shipping products are supported.

I'll have to keep that in mind next time I go laptop shopping.

Re:Decaying CPU business? (1)

rbanffy (584143) | more than 5 years ago | (#26914765)

Since I don't run Aero, The Sims, or Spore, and only occasionally play with Google Earth, I don't really care about 3D performance.

On the other hand, since I only run Windows under VirtualBox (and don't play games under it - BTW, since when did /. become a gamer site?), I do care about compatibility, and Intel has given me, for the last couple of years, the fewest headaches when it comes to 3D acceleration under Linux. While I would have to think hard and test a lot before buying a new computer with ATI or Nvidia graphics, I can always go for the Intel low-performance solution knowing it will be enough for me.

As for the money I don't spend on the GPU, I spend it on added memory, redundant storage and so on, things that are important for my work and that have more than once made up for the lacking 3D acceleration.

Larrabee will be sweet, but I won't carry it in my backpack anytime soon.

Intel GPUs are great (1)

jopsen (885607) | more than 5 years ago | (#26914785)

I only need very little 3D graphics, so Intel is actually very good... I had an Intel chipset in my old laptop and I never had driver issues of any kind...
In my new laptop I've got an ATI card, which I regret... I have nothing but problems with it... And there are NO open drivers for it... which is in fact the only reason I went ATI and not nVidia or Intel.
And unless ATI actually starts delivering drivers for their mobile HD series, I'll be looking at Intel for my next laptop, that's for sure...

Re:Decaying CPU business? (1)

Nursie (632944) | more than 5 years ago | (#26914821)

"If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs."

False.

They have very good support for Linux, to the extent that unless dual-booting and 3D games are your thing, they are pretty much the best option. Until AMD/ATI start making progress, that is.

Also, for business use they are cheap, reliable, and a lot less power-hungry than the other two big players. For a business desktop/workstation they make a lot of sense.

So? (1)

Moraelin (679338) | more than 5 years ago | (#26914921)

1. NVidia sells integrated GPUs too, and they too count crappy integrated GPUs as GPUs sold. And yes, even if you later go and buy an ATI 4870, Nvidia still counts it as a GPU sold.

So it seems to me like the GP's basic point still stands: Intel sells more GPUs than Nvidia. By a metric Nvidia itself uses when they willy-wave about their market share being larger than ATI's.

2. You seem to assume that it's some inescapable misfortune for the users, or that that's somehow not included in the choice to buy this computer vs the other computer.

Newsflash: most people don't actually care about the GPU itself. They want a computer. And if they wanted a gaming rig that tops all benchmarks, there are enough companies selling them one. It's not like when they go to Dell's site there isn't a gaming computer category.

So, yes, the decision was made at some point to buy a computer which barely runs Aero well, because they decided that they don't need more. And if an Intel integrated GPU was the cheapest option there, I fail to see what the problem is.

Basically (for the mandatory bad car analogy) it's like when you buy a car: you don't actually give a flying fuck about the exact model of the gearbox under the hood. You might care about miles per gallon, price, whether your family fits in it, speed and acceleration maybe, insurance price, and/or the status-symbol value of that car brand. But whether it's a Ford transmission or one bought/licensed from Toyota, who cares? If they can save you some money by using transmission X instead of transmission Y, and the car still fits your criteria, why would you feel shafted? And if that saving came from getting it bundled with, say, the suspension, again, who cares?

Same here. If mom wanted a computer which runs Windows, does email, and is good enough for photoshopping photos taken with her digital camera, why would she care whether it's a discrete high-end GPU or an integrated solution from any of the manufacturers? The whole computer still does what she needs it to do, and the latter costs less than the former. And if the one with the Intel integrated chipset costs less than the one with the Nvidia integrated chipset, so be it; that's the one she's going to buy.

Huang Knows His Stuff (1)

Louis Savain (65843) | more than 5 years ago | (#26913719)

Huang must have read this blog article, Heralding the Impending Death of the CPU [blogspot.com]. Which is cool, but Huang apparently declined to read this other article, Parallel Computing: Both CPU and GPU Are Doomed [blogspot.com], for obvious reasons.

Re:Huang Knows His Stuff (1)

GigaplexNZ (1233886) | more than 5 years ago | (#26913923)

That Parallel Computing: Both CPU and GPU Are Doomed [blogspot.com] article is somewhat confused over the definition of a CPU. They suggest that the CPU and GPU (plus other systems) would be replaced with a single chip that does everything. Doesn't that effectively fit the definition of a unit that processes information centrally (aka CPU)? Sure, the architecture may change, but it is still a CPU.

Re:Huang Knows His Stuff (3, Interesting)

Louis Savain (65843) | more than 5 years ago | (#26913979)

Not just any single chip, but a homogeneous multicore processor, that is, one that has multiple processing cores. There is a difference between a CPU and a multicore processor.

Re:Huang Knows His Stuff (1)

GigaplexNZ (1233886) | more than 5 years ago | (#26914253)

There is a difference between a CPU and a multicore processor.

If that processor is the central processor, it doesn't matter how many cores it has or what its internal architecture is in my opinion. It is still a Central Processing Unit.

(I can't help but get the feeling that you are suggesting that current multicore processors such as the Core and i7 aren't actually CPUs as they also are homogeneous multicore processors)

Re:Huang Knows His Stuff (1)

thsths (31372) | more than 5 years ago | (#26914073)

Well, whichever way you look at it, it is going to happen. The future is a multi-core CPU with SIMD capabilities that will outshine current GPUs and CPUs alike.

The main question is whether the cores are going to be identical, or whether there will be dedicated cores for specific purposes. Same for the memory interface: unified or NUMA? But those are minor architectural issues that only fill in the details of the big picture.

Re:Decaying CPU business? (0, Offtopic)

randyleepublic (1286320) | more than 5 years ago | (#26914031)

testing My Cowboy Bug. Slow down my ass.

Re:Decaying CPU business? (1)

CarpetShark (865376) | more than 5 years ago | (#26914917)

Is the GPU a commodity yet? No.

What about the CPU? Probably.

But I do think nvidia are reaching a bit on this one.

Re:Decaying CPU business? (1)

Rennt (582550) | more than 5 years ago | (#26915117)

The CPU is dead. Netcraft confirms it.

Creative Labs? (2, Insightful)

DigiShaman (671371) | more than 5 years ago | (#26913359)

What's next, Creative starts bitching too because their APUs (Audio Processing Unit) are being snuffed out by nVidia and Intel?

Hey you two, STFU. Your technologies are forever joined at the hip in modern computing. Stop the bitch slapping and grow up.

Decaying CPU business? WTF? (1)

dark42 (1085797) | more than 5 years ago | (#26913363)

Then why is NVIDIA trying to build a CPU themselves, if it's a decaying business?

Re:Decaying CPU business? WTF? (1)

Tiro (19535) | more than 5 years ago | (#26913445)

Prestige, probably. Why do so many tycoons buy newspapers and airlines, despite their notoriety for losing cash?

Same reason.

Re:Decaying CPU business? WTF? (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26913889)

Because it gives them control of the entire platform. nVidia are the only people left with a GPU and chipset but no CPU of their own: Intel, AMD & Via have all three.

Re:Decaying CPU business? WTF? (1)

rbanffy (584143) | more than 5 years ago | (#26914895)

The desktop CPU market is being eaten away by the low-power, notebook-friendly segment. The writing has been on the wall since the Core Duos started appearing in Macs, with a low-power profile and a decent punch, much better than any Pentium 4 of the time.

Nvidia is trying to build a CPU/GPU SoC-like thing that's very notebook-friendly and could also power business desktops, which traditionally don't require top-of-the-line performance and can reap huge savings from power efficiency.

Intel will fight them with all they have.

After all, companies did not replace CRTs with LCDs to free desktop space.

What's my line? (3, Funny)

Ostracus (1354233) | more than 5 years ago | (#26913397)

"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.""

Sounds like he reads slashdot.

Re:What's my line? (4, Funny)

theheadlessrabbit (1022587) | more than 5 years ago | (#26913457)

"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.""

Sounds like he reads slashdot.

It sounds more like he NEVER reads Slashdot.
He said nothing about welcoming overlords, Natalie Portman, or hot grits. The phrase IANAL never came up, and no car analogies were used.
How does he sound like a Slashdotter?

Re:What's my line? (1)

chaboud (231590) | more than 5 years ago | (#26913695)

Has it ever bothered anyone else that IANAL looks so much like "I anal," and is that an invitation?

I just noticed this today, oddly, but it's bugging the hell out of me.

More on the point, there was no racist ripping, no your/you're/yor or their/there/I-like-my-cat confusion, so, yeah, this isn't very slashdotty. That said, those things are the things that people put out when they *post* on slashdot. I'd wager that readership is 20:1, at least.

Re:What's my line? (0)

Anonymous Coward | more than 5 years ago | (#26913821)

IANAL, UANAL, we all ANAL.

even ur moms ANAL

Re:What's my line? (1, Funny)

Anonymous Coward | more than 5 years ago | (#26914119)

And it's buggering the hell out of me too.

Re:What's my line? (1)

Seth Kriticos (1227934) | more than 5 years ago | (#26914639)

Well, if you read the acronym correctly, then it says "I am not anal". Believe me, nobody on Slashdot wants to be anal.

*ducks*

Re:What's my line? (1, Funny)

Anonymous Coward | more than 5 years ago | (#26914977)

It took me a long time to figure out what IANAL meant and slashdot was a very scary place back then.

Re:What's my line? (0)

Anonymous Coward | more than 5 years ago | (#26913873)

I love it how this is marked informative. (:

Re:What's my line? (1)

mrchaotica (681592) | more than 5 years ago | (#26914775)

You must not read Slashdot either, because you completely missed the most obvious meme of all:

"The CPU is dying -- Netcraft confirms it!"

Re:What's my line? (1)

jonaskoelker (922170) | more than 5 years ago | (#26914789)

In Soviet Russia, meme forgets you!

Re:What's my line? (0)

Anonymous Coward | more than 5 years ago | (#26914059)

"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business."

No, this is clearly an attempt at spin, in the high style of NVidia's traditionally highly destructive habit of trash-talking the very people they most need to collaborate with. It is most endearing.

The fact of the matter is that Intel doesn't need NVidia. They are perfectly happy to soak up the bottom of the GPU market without NVidia cutting in. Furthermore, what Intel apparently has realized is that multi-/many-core is probably coming, but the jury is still out on whether software developers will get their parallelism through (p)threads or OpenCL/DX11. If the former, then the CPU will likely be able to grow and continue to enjoy healthy margins. If the latter, then the CPU will probably stall out at 2 or 4 processors, and then Intel will be sitting there watching their margins dry up due to the ugly side of Moore's law -- if the transistor count isn't going up exponentially, then the cost of the product is going down exponentially, and that would be deadly for Intel's margins. Thus, it is clear that Intel needs to diversify into the GPU side of things, specifically into the high-margin part that can be expected to continue to support Intel in the style to which they've become accustomed.

NVidia, on the other hand, does need Intel. Intel is the gatekeeper to the x86 platform. (There is AMD too, but we can only imagine that they have no need for NVidia to supply graphics cores.) More often than not, the gating factor on GPU performance is the cost of getting data onto the GPU. Intel controls the bus technologies, and can decide whether to starve the GPU on a one-lane PCI-E link, provide QPI support, or have seriously kinky sex with massive interchange of bodily fluids. In short, Intel and NVidia have found themselves sharing a milkshake, and Intel gets to pick the straws. It appears Intel has given NVidia a mighty small straw and told them to go suck it.

NVIDIA's Plan (1)

dotslashdot (694478) | more than 5 years ago | (#26913451)

A GPU that replaces the CPU becomes the CPU. NVIDIA created a GPU API that allows the GPU to be used for ANY computation, not just graphics. NVIDIA is trying to become the CPU, or relegate the CPU to a very small role, to grow NVIDIA's business. They are trying to choke Intel by doing an end run around the limits of their contract with Intel. As the GPU does more and more computation, it becomes indispensable.

Re:NVIDIA's Plan (1)

Ostracus (1354233) | more than 5 years ago | (#26913527)

Well that'll never happen for the simple reason that CPUs and GPUs are different architectures geared towards different processing models. At best you'll have a GPU/CPU combination aka CPU/coprocessor. That's one of the reasons AMD bought ATI. For Intel to stay in this race they have to have a decent GPU to go with their CPUs. And Nvidia needs a good CPU to go with their GPU. Everyone's approaching this from their strengths towards their needs.

Re:NVIDIA's Plan (1)

DMalic (1118167) | more than 5 years ago | (#26914301)

Huh? The whole point is their mutual attempt to reach a massively parallel GPU-like monster which can rip through all the code you can think of just like a CPU. The question is whether this crossbred abomination will originate as a mutated CPU or GPU.

Re:NVIDIA's Plan (1)

rbanffy (584143) | more than 5 years ago | (#26914915)

"CPUs and GPUs are different architectures geared towards different processing models"

Perhaps.

But look at the SSE extensions to the x86 instruction set or the Cell: CPUs are becoming more GPU-like. If you could build a GPU with some CPU-like cores, you could do some serious damage in the future.

We are in for some entertaining developments.

And, since we have enjoyed a couple decades of x86 monotony, it's about time.

Re:NVIDIA's Plan (1)

setagllib (753300) | more than 5 years ago | (#26913541)

I think you're way off on what CUDA, and GPGPU in general, really are. They let you run math kernels on your video card. They do NOT allow the GPU to be used for "any" computation in any practical sense. It may even be Turing complete, but it is so insanely impractical for general processing tasks that it won't replace the CPU, nor should it.

Similarly, the CPU as we know it is not fit for handling GPU tasks. Intel is trying to solve that by optimising CPUs and cramming more of them onto a board to make an equivalent to a GPU. They are going to be infinitely more successful at this than nVidia could hope to be at replacing the CPU with anything remotely GPU-ish.

Jen-Hsun Huang (1)

macshit (157376) | more than 5 years ago | (#26913455)

Yeesh, does he ever say anything in public that doesn't sound like drug-addled desperate bluster...? He's like the Ken Kutaragi of the PC world...

nVidia doesn't have bargaining IP (1)

Rockoon (1252108) | more than 5 years ago | (#26913507)

What is at stake isn't just nVidia's chipset business, it's their entire business.

They argue that the CPU is just the glue for highly parallel GPU operations, but Intel is planning to turn the CPU itself into a highly parallel monster needing no GPU at all... just some integrated display logic.

If nVidia loses its chipset market, they will have to drastically scale down their business.

I've said it before and I'll say it again... nVidia doesn't have anything to offer the other big players in the market with regard to licensing agreements. They have no essential IP to speak of. They are at a big disadvantage, even though they are currently king of the GPUs.

Typical bluster (4, Interesting)

CajunArson (465943) | more than 5 years ago | (#26913513)

Jen-Hsun Huang has never been one to keep his trap shut when given the chance... even though Nvidia is in the red right now. Lesson one: when a CEO comes out and tries to use a legal dispute related to a contract as a pulpit for a religious sermon, he knows he's wrong. See Darl McBride and Hector Ruiz as other examples of dumbass CEOs who love to see themselves in magazines but don't want to be bothered with pesky details like turning a profit or actually competing.
Intel is #1 in graphics when it comes to shipments... now I'm not saying I'd want to play 3D games on their chips, but guess what: despite what you see on Slashdot, very few users want to play those games. Further, I've got the crappy Intel integrated graphics on my laptop, and Kubuntu with KDE 4.2 is running quite well thanks to the 100% open source drivers that Intel has had its own employees working on for several years. I'm not saying Intel graphics will play Crysis, but they do get the job done without binary blobs.
Turning the tables on Huang, the real "fear" here is of Larrabee... this bad boy is not even going to require "drivers" in the conventional sense; it will be an open, stripped-down x86 chip designed for massive SIMD and parallelism... imagine what the Linux developers will be able to do with that, not only in graphics but for GPGPU using OpenCL. Will it necessarily be faster than the top-end Nvidia chips? Probably not... but it could mean the end of Nvidia's proprietary driver blobs for most Linux users, who can get good performance AND an open architecture... THAT is what scares Nvidia.

Re:Typical bluster (4, Interesting)

CajunArson (465943) | more than 5 years ago | (#26913537)

I hate to respond to myself, but: yeah, the market share of Linux is not huge, and Nvidia is probably not terrified of losing sales to Larrabee on some desktop Linux boxes (high-end supercomputing apps could be an interesting niche they might care about, though). However, it is afraid that OEMs will be interested in Larrabee as a discrete card, where Intel never had a solution before. Given the problems that Nvidia has had with execution over the last year, and the fact that Intel knows how to keep suppliers happy, THAT is what Nvidia is really afraid of.

Re:Typical bluster (1)

forkazoo (138186) | more than 5 years ago | (#26914883)

I hate to respond to myself, but: yeah, the market share of Linux is not huge, and Nvidia is probably not terrified of losing sales to Larrabee on some desktop Linux boxes (high-end supercomputing apps could be an interesting niche they might care about, though). However, it is afraid that OEMs will be interested in Larrabee as a discrete card, where Intel never had a solution before. Given the problems that Nvidia has had with execution over the last year, and the fact that Intel knows how to keep suppliers happy, THAT is what Nvidia is really afraid of.

Don't be too quick to dismiss the importance of Linux for nVidia out of hand. There are two distinct markets that involve Linux. One is ordinary folks with GeForce cards. Linux gamers are pretty rare, but Linux users tend to be computer enthusiasts, so they tend to like having decent cards regardless. They run free software, so they aren't necessarily going to be spending tons of money, and they are smart enough not to buy the latest thing just to impress their friends. Then you get group two. Call it the Hollywood Linux market. These guys have special nVidia CUDA boxes for their color grading systems, they have crazy $5000 video cards with HDSDI outputs in their Flame boxes, and they get high-end Quadros for running Maya. Add in a couple of CS researchers at universities building compute clusters from GPUs and whatnot. Group one involves a hell of a lot more people than group two, and they probably can be ignored relatively safely. Group two is tiny by comparison, and most people will never see the hardware they use in person, but it's still pretty farking significant to the bottom line. It's all the people who used to use SGI hardware, and thus think a $5000 video card is cheap.

But yeah, if Larrabee lives up to the hype and OEMs start to use it to displace GeForces, nVidia shits itself.

if...why (1)

Konster (252488) | more than 5 years ago | (#26913549)

If the CPU has run its course, why is Nvidia suing Intel to get a slice of a dying technology?

Re:if...why (1)

Jthon (595383) | more than 5 years ago | (#26913621)

CPUs won't be going away; instead they're becoming less important and more of a commodity part, which is what Intel is terrified of. Plus you'll still need something to run the OS and handle some I/O while the GPU is crunching numbers.

Plus, while the CPU might be dying (if you believe NVIDIA), we're not at the point where you don't need one. Intel is the big player, and if you want to ship products you need a platform to talk to the CPU.

Intel wants to throw hurdles in the way and delay NVIDIA until they launch their own GPU. If they didn't think this was where the market was going they probably wouldn't be spending so much effort on Larrabee.

if...why:x86 (1)

Ostracus (1354233) | more than 5 years ago | (#26913631)

Specifically why the "dying" x86 technology?

CPU a decaying business, yeah right... (1)

V!NCENT (1105021) | more than 5 years ago | (#26913573)

This is clearly an attempt to stifle innovation to protect a decaying CPU business.

So I can replace my CPU with a GPU the next time I buy a computer? Oh wait...

In five years nobody will need a powerful GPU anymore. Run a real-time ray tracing benchmark on your PC, and if you've got a recent powerful CPU (like me) you'll see why scanline rendering is nearing its end. At 1024x768 I get 60fps without shadows and 30fps with hard shadows. The benchmark wasn't even optimised for multithreading!

Re:CPU a decaying business, yeah right... (1)

batkiwi (137781) | more than 5 years ago | (#26914101)

OR everyone will have dual/quad-core 1GHz Atom-esque processors that cost $10 (and thus have no margins), and all compositing/video decode/specialist heavy lifting will be done by a "GPU/PPU"-type chip.

Look at the nVidia ION for this exact situation... using a web browser doesn't need grunt. Working on a Word doc doesn't need grunt. Watching an HD video does, and the CPU is horrible at it.

Re:CPU a decaying business, yeah right... (1)

SanityInAnarchy (655584) | more than 5 years ago | (#26914217)

Watching an HD video does, and the CPU is horrible at it.

Maybe relatively. It might even matter in a laptop...

On a modern dual-core 2.5GHz CPU, I can play HD video fullscreen, 1080p, smoothly.

So the real question is whether manycore is really that much more expensive or wasteful than decent specialized chips.

Re:CPU a decaying business, yeah right... (1)

V!NCENT (1105021) | more than 5 years ago | (#26914505)

I predict that in five years GPUs will be on the same chip as the CPU (AMD Fusion, Intel Larrabee). GPUs will be geared towards OpenGL (and Direct3D, for what's still left of Microsoft Windows; believe me when I say Linux and Apple are going to kill Windows) compositing for 3D window management/desktops. The GPU will focus strongly on image processing (ray-tracing post-processing and video) and high resolutions (HD stuff). The CPU will at first be used for ray-traced games. Later on, when the 'GPU on CPU' approach is efficient enough for compositing and image processing, VIA and other cheap GPU manufacturers will go out of business (or at least the GPU part of the company, in VIA's case, will cease to exist, or VIA will bundle their low-power CPUs with low-power GPUs on one chip). Matrox will be able to get back in the picture with higher-quality dedicated image processing, as they specialise in 2D graphics. Then of course nVidia & Co. will probably go into dedicated number-crunching cards (CUDA stuff) for dedicated ray tracing, physics, fluid animation, etc. When that happens, ray-tracing engines will get heavier on the calculation front, to the point where the general-purpose CPU won't have enough power to cope with HD resolutions and all the effects. Matrox will be out of the picture again, because nobody is going to buy two dedicated cards when nVidia & Co. will also be offering post-processing on their number crunchers.

Re:CPU a decaying business, yeah right... (1)

V!NCENT (1105021) | more than 5 years ago | (#26914417)

As I was saying... *sigh*:

In five years nobody will need a powerful GPU anymore

You really think that all the stuff you're pointing out in your post can't be done by a cheap onboard GPU in 5 years?

That very onboard GPU might even be on the same chip as the CPU (Fusion-style CPUs ring a bell?).

This lawsuit doesn't pass the smell test (0)

Anonymous Coward | more than 5 years ago | (#26913601)

There's got to be some information withheld here. Why would Intel and Nvidia put their relationship at risk? Intel needs them for high-end graphics, and Nvidia needs them for the CPU.

This dispute seems self destructive to both parties and is only of benefit to ATI/AMD.

Re:This lawsuit doesn't pass the smell test (2, Interesting)

GigaplexNZ (1233886) | more than 5 years ago | (#26914425)

Intel needs them for high-end graphics

Not according to Larrabee. [wikipedia.org]

Nvidia needs them for the CPU

While that is mostly true, it isn't the whole story. They could rely solely on AMD CPUs (which could either cripple NVIDIA or boost AMD or both), or they could try to weasel around the patent issues and make their own CPU.

Funny, witty (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26913603)

farts

give me a 5

They're like babies. (0, Offtopic)

Khyber (864651) | more than 5 years ago | (#26914111)

Waaaahhhhhhh.

I wish some company would create a new chip that runs at a THz or something.

OpenCL? (1)

Shag (3737) | more than 5 years ago | (#26914183)

Intel, Nvidia and AMD helped Apple formulate the original proposal for OpenCL... Intel makes Apple's CPUs, Nvidia increasingly makes the GPUs (sometimes 2 in a single laptop). So there's bound to be some smack-talking about CPUs vs. GPUs and all that.

I think Apple will be the first to have OpenCL support in an OS, and as others follow suit and we see more CPUs and GPUs in machines, this little tiff might conceivably end up meaning... something.

Re:OpenCL? (1)

witherstaff (713820) | more than 5 years ago | (#26915071)

Having just seen a friend's presentation on the rise of GPU computing, I'm looking forward to seeing OpenCL come out. Just on his test rig with a single 128-core Nvidia Tesla, a few hundred gigaflops of number crunching was very interesting. He was also cheap and didn't throw in 3 more cards to get up to a few teraflops of processing. OpenCL could be a game changer for personal computers.

Drop the onboard parts (1)

WeeBit (961530) | more than 5 years ago | (#26914383)

The computer hardware manufacturers are better off getting rid of all of this integrated stuff and concentrating on the real hardware. I believe it would be a big boost to the industry as well, plus a step in the right direction.

"Ouch and double ouch"? (2, Insightful)

seeker_1us (1203072) | more than 5 years ago | (#26914457)

More like strawman and double strawman. Jen-Hsun Huang talks about GPUs. Intel is talking about chipsets.

You can plug an NVIDIA GPU card into an Intel motherboard (I did just that for the computer I am using).

I have no idea why Intel wouldn't want Nvidia to make chipsets for the Core i7. For some reason, even years after AMD bought ATI, the only Intel mainboards which support two linked graphics cards do so through Crossfire. So if Nvidia doesn't make chipsets that support the Core i7, Intel would be forcing hardcore gamers to either (a) buy AMD's video chips to use Crossfire or (b) buy AMD's CPUs to use NVidia SLI.

Deluded (1)

segedunum (883035) | more than 5 years ago | (#26914695)

I'm always uncomfortable when a CEO goes on a crusade like this:

At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.

Errrrrr, I think you'll find it's the other way around, mate. That is, after all, why you're making comments like this?
