Nvidia Claims Intel's Larrabee Is "a GPU From 2006"

Soulskill posted more than 5 years ago | from the dem's-fightin'-woids dept.

Graphics 278

Barence sends this excerpt from PC Pro: "Nvidia has delivered a scathing criticism of Intel's Larrabee, dismissing the multi-core CPU/GPU as wishful thinking — while admitting it needs to catch up with AMD's current Radeon graphics cards. 'Intel is not a stupid company,' conceded John Mottram, chief architect for the company's GT200 core. 'They've put a lot of people behind this, so clearly they believe it's viable. But the products on our roadmap are competitive to this thing as they've painted it. And the reality is going to fall short of the optimistic way they've painted it. As [blogger and CPU architect] Peter Glaskowsky said, the "large" Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI.' Speaking ahead of the opening of the annual NVISION expo on Monday, he also admitted Nvidia 'underestimated ATI with respect to their product.'"

278 comments

Doh of the Day (5, Insightful)

eddy (18759) | more than 5 years ago | (#24725623)

...he also admitted Nvidia 'underestimated ATI with respect to their product.'

Good, learn from that and don't make that same mistake again!

Larrabee [...] will have roughly the same performance as a 2006 GPU from Nvidia or ATI.'

DOH!

no wonder it's slow (5, Funny)

Anonymous Coward | more than 5 years ago | (#24725627)

No wonder it's so slow. He keeps making reference to how it paints things. Can't move on to another frame until the previous one has dried.

Intel isn't aiming at gamers (5, Insightful)

Joce640k (829181) | more than 5 years ago | (#24725633)

So why is NVIDIA on the defensive?

Intel is aiming at number crunchers (note that their chip uses doubles, not floats). They don't want NVIDIA to steal that market with CUDA.

When Intel says "graphics", they mean movie studios, etc.

If Larrabee eventually turns into a competitor for NVIDIA, all well and good, but that's not their goal at the moment.

Re:Intel isn't aiming at gamers (5, Insightful)

Shinatosh (1143969) | more than 5 years ago | (#24725693)

OK. Nvidia and AMD/ATI will probably outperform the Intel GPU. However, Intel publishes open specs for its GPUs, so for non-gamers there will be a GPU with quite good performance to use under Linux for various purposes, with high-quality OSS drivers. I'm looking forward to it.
Shinatosh

Re:Intel isn't aiming at gamers (0, Insightful)

Anonymous Coward | more than 5 years ago | (#24725777)

You mean like ATI has open-sourced and open-specced most of its hardware?

Yes, I can see how that would give Intel a massive advantage...

Re:Intel isn't aiming at gamers (5, Insightful)

Nymz (905908) | more than 5 years ago | (#24725715)

So why is NVIDIA on the defensive?

I think the Nvidia people think pretty highly of themselves, rightfully so, and Intel has recently been making a number of bold claims without backing them up. In a poker analogy, Nvidia is calling Intel's bluff.

Re:Intel isn't aiming at gamers (2, Funny)

Zosden (1303873) | more than 5 years ago | (#24726105)

Are you new here? Slashdotters don't use poker analogies. It's all about car analogies. It's like Nvidia is a McLaren F1 and Intel is a Corvette, and the Corvette is trying to say it is better than the McLaren.

Re:Intel isn't aiming at gamers (3, Interesting)

sortius_nod (1080919) | more than 5 years ago | (#24726393)

I wouldn't call it that.

I'd call it a knee-jerk reaction to a non-issue.

Nvidia are getting very scared now that ATi are beating them senseless. I run both ATi and Nvidia, so don't go down the "you're just a fanboy" angle either.

I've seen chip makers come and go; this is just another attempt by Nvidia to shore up support for their product, but this time they can't turn to ATi and say "look how crap their chips are" - they have to do it to Intel, who are aiming their chips at corporate markets.

To be honest, the best bang for buck at the lower end of the market for 2D seems to be the Intel chips. One thing that does tend to surprise people is the complete lack of performance that the Nvidia chipsets have when not in 3D. ATi don't seem to have these problems, having built on a solid base of 2D graphics engines in the '90s (Rage/RageII is at least one reason why people went with Macs back then). Nvidia is really feeling the pinch with ATi taking up the higher end of the market (pro gear/high-end HD) and Intel shoring up the lower end (GMA, etc). Nvidia are pretty much stuck with consumers buying their middle-of-the-line gear (8600/9600).

When you aim high, it really hurts when you fall from grace; the whole 8800-to-9800 leap was abysmal at best, unlike their main competitor, who really pulled their finger out to release the 3xxx and 4xxx series.

All in all this seems like a bit of pork barrelling on Nvidia's part to detract from the complete lack of performance in their $1000 video card range. If anything this type of bullshit will be rewarded with a massive consumer (yes, geek and gamer) backlash.

I know my products, I know their limitations - I don't need some exec talking crap to tell me, and base level consumers will never read it.

Re:Intel isn't aiming at gamers (1)

Intelista (1187985) | more than 5 years ago | (#24725763)

I think you're right. Larrabee or some derivative may be good at graphics. It will almost certainly blow away current GPGPU solutions as workstation accelerators.

Re:Intel isn't aiming at gamers (2, Insightful)

Ant P. (974313) | more than 5 years ago | (#24725773)

So why is NVIDIA on the defensive?

They're substituting rhetoric for an actual competitive product. Right now they're crapping their pants because they gambled everything on Vista which is failing spectacularly, whereas both ATi and Intel have got a 2 year head start on supporting Ubuntu out of the box. You can say Ubuntu isn't linux, but it's what all those Dell buyers are going to see.

Re:Intel isn't aiming at gamers (2, Informative)

sammyF70 (1154563) | more than 5 years ago | (#24725887)

You should get your facts straight: ATI's Linux drivers are atrocious (that might have changed in the last six months or so, but I wouldn't bet on it). Between the two, only nVidia has halfway good drivers for its products.

Re:Intel isn't aiming at gamers (1)

ThePhilips (752041) | more than 5 years ago | (#24726109)

ATI recently released specs for their R600 chips.

Their driver might suck big time - as does its open-source counterpart - yet in the long term, ATI has a huge advantage right now. In my eyes, sincere Linux support is a huge advantage - though I game exclusively on Windows.

Needless to say, the dialog ATI has established with its Linux users and the OSS developer community will also contribute positively to its proprietary drivers.

Both Intel and nVidia - proprietary driver companies - should be on the defensive right now.

I'm pretty sure that whatever Intel is cooking up will be big, because many manufacturers live off the x86 ecosystem Intel created. nVidia has always striven to capture the top of the market, often neglecting its ordinary users, but it has always remained very closed. ATI, on the other hand, simply had no choice but to do something new and radical, so they supported Linux and OSS. Unlike nVidia, they also license CrossFire.

I think the GPU market battle is overhyped, yet I will gladly follow all the buzz.

Re:Intel isn't aiming at gamers (3, Interesting)

sammyF70 (1154563) | more than 5 years ago | (#24726349)

Yes, I know about ATI releasing the specs, which is why I said it might have gotten better by now, though I guess it's going to be some time before we see anything happen (but it probably will).

Re:Intel isn't aiming at gamers (0)

Anonymous Coward | more than 5 years ago | (#24725923)

Uh, I guess you've never actually used Ubuntu, because NVidia drivers traditionally perform much better than ATI's Linux drivers.

Re:Intel isn't aiming at gamers (3, Insightful)

TheRaven64 (641858) | more than 5 years ago | (#24725979)

Uh, nVidia has supported Linux, FreeBSD and Solaris for some years now. The drivers are binary-only, which makes them unacceptable to some people (I'd prefer not to run them, personally), but if you're buying a Dell with Ubuntu or a Sun with Solaris on it you can easily use an nVidia GPU and get the same sort of performance you would from Windows.

Re:Intel isn't aiming at gamers (1)

the_womble (580291) | more than 5 years ago | (#24726167)

both ATi and Intel have got a 2 year head start on supporting Ubuntu out of the box. You can say Ubuntu isn't linux, but it's what all those Dell buyers are going to see.

I was not aware that there are specific "Ubuntu" drivers. Can you confirm that's what you mean? Otherwise that last sentence does not make sense to me.

Support for binary drivers in Ubuntu (2, Informative)

DrYak (748999) | more than 5 years ago | (#24726573)

The nVidia drivers are binary only, so they are not available in the standard source repositories and are not compiled and included by default in most open-source distributions.

Ubuntu has made the necessary arrangements and provides, out of the box, a tool that can automatically download and install binary drivers from within the usual setup tool.

I think that's what the parent poster was referring to.

That means that, instead of having to manually download a package and execute it (from the command line) - which isn't complicated but requires some interaction with the computer - installing a binary driver under Ubuntu simply means clicking "yes" on a dialog asking "the following hardware requires non-free proprietary drivers, would you like to install them?"
It's made trivial enough that computer-illiterate users can still do it easily - well, almost. The users still need to realize that maybe they should get some software to make the graphics work better.

Behind the scenes, clicking "yes" automatically adds the non-free drivers repository to apt and selects the necessary packages for installation.

The result is similar (although differently implemented) to openSUSE's one-click install, where you click a link on a web page to a file whose name ends in ".ymp", and the corresponding repositories are added to YaST and the packages are selected for installation.

Re:Intel isn't aiming at gamers (4, Insightful)

Kjella (173770) | more than 5 years ago | (#24726321)

Right now they're crapping their pants because they gambled everything on Vista which is failing spectacularly, whereas both ATi and Intel have got a 2 year head start on supporting Ubuntu out of the box.

Traditionally (before the AMD buyout) ATI had terrible support under Linux; nVidia has been delivering binary drivers that, in my experience, have been easy to install, stable and fully functional for much longer than ATI's. By the way, the latest 177.68 drivers should now have fixed the recent KDE4 performance issues. From what I've understood, the fglrx (closed source) ATI driver has made considerable progress, so maybe now they're equal, but closed source versus closed source, nVidia has nothing to be ashamed of. For example, ATI just this month added CrossFire support, while SLI has been supported since 2005 - that's more like three years behind than two years ahead.

Of course, ATI is now opening up their specs but it's going slowly. For example, they have not yet received [phoronix.com] the 3D specs on the last-generation R600 cards, much less the current generation. And after those specs are released some very non-trivial drivers must be written as well, meaning it could take another few years before we see fully functional open source drivers. Also, this strategy is less than a year old, so if they're two years ahead they're moving fast. None of this should make nVidia executives the least bit queasy.

They are crapping their pants because ATI has delivered two kick-ass parts in the 4850 and 4870, and there's very little reason to buy anything else these days. They are crapping their pants because Intel won't give them a Nehalem system bus license. They're crapping their pants because the 800lb gorilla that's Intel is entering a market everyone else has been leaving. They're crapping their pants because the US economy is shaky and people might not spend big $$$ on graphics cards. But over Linux support? Put it into perspective, and you'll see it's an extremely minor issue.

Re:Intel isn't aiming at gamers (0)

Anonymous Coward | more than 5 years ago | (#24726505)

"whereas both ATi and Intel have got a 2 year head start on supporting Ubuntu out of the box."

Not to flame, but have you tried to setup a dual-monitor setup under Ubuntu with ATI vs Nvidia?

I tried two ATI cards unsuccessfully. The first was an ATI X1300 Pro AGP, the other was some older ATI Radeon 9200 or so PCI card. Neither worked. I tried everything I read, and I think the big issue was that when you activated the ATI driver in Ubuntu, X would quit working and just show a black screen.

So I purchased an Nvidia 6200LE AGP to try. Sure enough, enabling the nvidia driver worked flawlessly. Getting dual monitors was easy too: just had to apt-get the nvidia-config utility, run it, and that was that. The only hiccup was that it wouldn't save to xorg.conf for some reason, so I just hit the PREVIEW button and copy-pasted what it had into xorg.conf manually. Restarted X and it works perfectly.

Now there is one issue, and I doubt it has to do with the nvidia driver, though it may. I'm leaning more towards Xorg or some other item in Ubuntu, but if you're, say, watching a video on the 2nd monitor in VLC and you double-click for full screen, the video moves to the first monitor automatically and stays there. If you double-click to go back to windowed mode, it goes back over to the 2nd monitor, where it should have stayed the whole time.

Re:Intel isn't aiming at gamers (2, Insightful)

diegocgteleline.es (653730) | more than 5 years ago | (#24725961)

"movie studios"? Yeah, I'm sure Intel is putting a GPU in every Intel CPU (80% of the desktop market) just to make a couple of companies happy.

Why wouldn't Intel want to be a competitor to Nvidia? Intel has been the top seller of graphics chips, ahead of Nvidia or ATI, for some years. I'd say that they have been competing for a looong time now.

Intel has been very successful with their integrated graphics chips because most people in the world only need a chip that can draw Windows. Apparently, now they want to go beyond that. Larrabee can't catch Nvidia, but it will be "fast enough" to become an important target for game developers. Nvidia will always keep the "top-performance" tip of the market, but that tip is becoming smaller and smaller.

Movie studios ... or anybody who uses 3D studio (1)

Joce640k (829181) | more than 5 years ago | (#24726061)

Artists need faster render times more than they need faster on-screen interaction. Larrabee would be a good fit for them.

Re:Movie studios ... or anybody who uses 3D studio (1)

jo42 (227475) | more than 5 years ago | (#24726639)

And what percentage of the customer base are artists? 0.00000000001?

Re:Intel isn't aiming at gamers (3, Informative)

Excors (807434) | more than 5 years ago | (#24726017)

Intel is aiming at number crunchers (note that their chip uses doubles, not floats).

That's not true. From their paper [intel.com]:

Larrabee gains its computational density from the 16-wide vector processing unit (VPU), which executes integer, single-precision float, and double-precision float instructions.

And it's definitely aimed largely at games: the paper gives performance studies of DirectX 9 rendering from Half Life 2, FEAR and Gears of War.
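As a rough illustration of what "16-wide" means in practice - plain C standing in for the VPU, not Intel's actual LRBni intrinsics - each vector instruction applies the same operation to 16 single-precision lanes at once, so a SAXPY-style loop retires 16 elements per "instruction" instead of one; with doubles, the same register width holds only half as many lanes, which is why the single/double distinction matters for throughput:

<ecode>
#include <stdio.h>

#define LANES 16  /* the paper describes the VPU as 16-wide for 32-bit floats */

/* Conceptual model of a vectorised SAXPY: the inner "lane" loop stands in
 * for a single 16-wide multiply-add instruction; a scalar core would need
 * 16 separate instructions to do the same work. */
static void saxpy_16wide(int n, float a, const float *x, float *y)
{
    int i;
    for (i = 0; i + LANES <= n; i += LANES)
        for (int lane = 0; lane < LANES; lane++)   /* one "VPU instruction" */
            y[i + lane] = a * x[i + lane] + y[i + lane];
    for (; i < n; i++)                             /* scalar remainder */
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    float x[40], y[40];
    for (int i = 0; i < 40; i++) { x[i] = (float)i; y[i] = 1.0f; }
    saxpy_16wide(40, 2.0f, x, y);
    printf("y[5] = %.1f, y[39] = %.1f\n", y[5], y[39]);  /* 11.0, 79.0 */
    return 0;
}
</ecode>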

Re:Intel isn't aiming at gamers (3, Informative)

Anpheus (908711) | more than 5 years ago | (#24726591)

Did you not see the bit right after the text you bolded? "...double-precision float..."

Re:Intel isn't aiming at gamers (3, Insightful)

Forkenhoppen (16574) | more than 5 years ago | (#24726049)

It's obvious that the graphics angle is really just a Trojan horse; they're using graphics as the reason to get it into the largest number of hands possible, but what they really want to do is to keep people writing for the X86 instruction set, rather than OpenCL, DirectX 11, or CUDA. Lock-in with the X86 instruction set has served them too well in the past.

In other words, general compute was an area where things were slipping out of their grasp; this is a means to shore things up.

It's a sound business strategy. But I have to agree with blogger-dude; I don't see them being overly competitive with NVIDIA and AMD's latest parts for rasterized graphics anytime soon.

NVidia seems to be more and more scared (1)

boorack (1345877) | more than 5 years ago | (#24726223)

Depending on how a real Larrabee works (compared to the paper visions Intel shows today), it might render CUDA/Stream SDK efforts much less appealing. Even with half (or a quarter) of the performance of contemporary NV/ATI designs, it might be a strong competitor in general number crunching. And building it from x86-compatible cores is not as dumb a move as it looks at first glance.

The major difference between Larrabee and contemporary GPUs is that Larrabee is really fully programmable. It even supports multitasking and protection (it has a paging system). It does not force the programmer to use strictly data-parallel algorithms and does not make multipass algorithms so expensive (starting a task in the Stream SDK is very costly - around 30 ms or so, involving the X server and other unnecessary components on my Ubuntu box). Many algorithms (bitonic sort, for example) are a joke on NV/ATI just because of the huge cost of starting subsequent stages of computation.
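To put the launch-cost complaint in numbers - taking the ~30 ms figure above at face value (it is the poster's estimate for the Stream SDK of the time, not a measurement) and the standard k(k+1)/2 pass count of a bitonic sorting network on 2^k elements - the overhead alone dwarfs the useful work:

<ecode>
#include <stdio.h>

int main(void)
{
    const double launch_ms = 30.0;      /* per-pass launch cost cited above (an estimate) */
    const int k = 20;                   /* n = 2^20, about a million elements */
    const int passes = k * (k + 1) / 2; /* bitonic sorting network: k(k+1)/2 = 210 passes */

    printf("%d passes x %.0f ms = %.0f ms of launch overhead alone\n",
           passes, launch_ms, passes * launch_ms);
    /* ~6300 ms before counting any actual sorting work, which is why
     * multipass algorithms were painful on early GPGPU stacks. */
    return 0;
}
</ecode>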

The only hope for NVidia and ATI in the GPGPU area is making their devices more flexible and less pinned to traditional graphics processing, making them fully open and less dependent on X and proprietary drivers/extensions. I'm looking forward to fully open and programmable offerings from all three vendors, not to silly comments thrown at competitors.

Re:Intel isn't aiming at gamers (4, Insightful)

kripkenstein (913150) | more than 5 years ago | (#24726287)

No, Intel has been very clear that it is targeting games, even saying "we will win" about them, see this interview with an Intel VP [arstechnica.com].

NVidia is on the defensive for the simple reason that it needs to be. Not because Intel has a product that threatens NVidia, but because Intel is using classic vaporware strategies to undermine NVidia (and AMD/ATI). Intel is basically throwing around promises, and by virtue of its reputation a lot of people listen and believe those promises. With 'amazing Intel GPU technology just around the corner', some people might delay buying NVidia hardware. NVidia is trying to prevent that from happening.

Better than NVIDIA's proprietary hardware (2, Insightful)

Anonymous Coward | more than 5 years ago | (#24725655)

At least Intel documents their hardware. Fuck NVIDIA and their stupid proprietary hardware!

Glass

Re:Better than NVIDIA's proprietary hardware (2, Informative)

FoolsGold (1139759) | more than 5 years ago | (#24725789)

NVIDIA's proprietary hardware is what's capable of playing the games I want to play at the frame rate and quality I want.

For goodness sake already, why won't people stop being so ideological and just USE the damn hardware if it works better than the alternative. Pick what you need from a practical viewpoint, NOT on ideology. Life's not worth wasting one's efforts on the ideology of a fucking graphics chipset already!

Re:Better than NVIDIA's proprietary hardware (3, Interesting)

MrMr (219533) | more than 5 years ago | (#24725899)

Sorry, but I did exactly that, and got bitten recently: NVidia's drivers for old graphics cards lag behind more and more. I can no longer update one of my systems because the ABI version for their GLX doesn't get updated.
The fix would be trivial (just recompile the current version), but Nvidia clearly would rather sell me a new card.

Re:Better than NVIDIA's proprietary hardware (1)

FoolsGold (1139759) | more than 5 years ago | (#24725959)

I feel your pain, cos when I was running a desktop machine with an NVIDIA card I had problems of my own (e.g. standby not resuming properly, occasional glitches, etc). However, what pisses me off is when people assume that open-source drivers such as the Intel drivers are somehow better. Shit, the Intel graphics drivers in Linux don't properly support all the GL extensions that the Windows drivers do; there's a documented but as-yet-unfixed issue with the gamma levels when using XV for video playback; I've had a total lockup when switching from Compiz to Metacity, playing a 3D game and turning Compiz back on; etc.

My point is that open source graphics drivers haven't shown much of an improvement (for me), apart from the advantage of them working out of the box in Ubuntu. I'd much rather take the proprietary drivers with superior support and quality, for the most part, if no alternative exists. It'd be lovely to have the NVIDIA drivers open sourced, but until they do, I'm hardly going to avoid them if they provide a superior experience.

Re:Better than NVIDIA's proprietary hardware (2, Informative)

owlman17 (871857) | more than 5 years ago | (#24726035)

I'm using the 71.86.04 driver (released in January 2008) for my Riva TNT2 on an old PII-366. Works pretty well on a vanilla 2.4.x kernel.

Re:Better than NVIDIA's proprietary hardware??? (0)

Anonymous Coward | more than 5 years ago | (#24726293)

Works well indeed.

My kitchen computer has a Riva TNT2 and also works fine with the latest vanilla kernels.

BTW: Leave this enabled:
"Enable deprecated pci_find_* API" (CONFIG_PCI_LEGACY) as some nvidia drivers are still using it.

At least Intel documents their hardware. Fuck NVIDIA and their stupid proprietary hardware!

Although I see a lot of benefits to the nvidia drivers being open source, the truth is that most people who complain about it don't have the time or knowledge to make anything useful out of driver openness.

So, stop complaining and file proper bug reports when you run into a problem.

Re:Better than NVIDIA's proprietary hardware??? (1)

MrMr (219533) | more than 5 years ago | (#24726701)

As I've used this particular system with only Linux for about 7 years with various distros, I am well aware of the configuration issues. This bug was filed months ago.
The current workaround: don't update X11 until Nvidia updates their proprietary GLX library for the legacy drivers. This nicely illustrates the problem with closed source drivers.
There will come a time - not chosen by you, but by the manufacturer - when the hardware you bought will stop functioning correctly.

Re:Better than NVIDIA's proprietary hardware (1)

MrMr (219533) | more than 5 years ago | (#24726625)

Don't upgrade X.org or you'll be out of luck. But then, I don't think there is a distro with 2.4 kernels that even uses that fork.

Re:Better than NVIDIA's proprietary hardware (1)

Jeremy Erwin (2054) | more than 5 years ago | (#24725967)

Pick what you need from a practical viewpoint, NOT on ideology.

But Free software is practical. If it breaks, you can fix it. With closed source, you have to rely on someone else to fix it for you.

Re:Better than NVIDIA's proprietary hardware (2, Informative)

Anonymous Coward | more than 5 years ago | (#24726079)

And for those of us who don't code and do other things for a living, I guess we are just shit out of luck then?

I spend my money on hardware and drivers. In other words, I'm paying someone else to do it right.

Re:Better than NVIDIA's proprietary hardware (1)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#24725993)

Circa 1980, MIT AI lab...

"Xerox's 9700 laser printer prints the documents I want to print with the speed and quality that I want them printed."

"For goodness sake already, why won't people stop being so ideological and just USE the damn hardware if it works better than the alternative. Pick what you need from a practical viewpoint, Not on ideology. Life's not worth wasting one's efforts of the ideology of a fucking laser printer already!"



History [faifzilla.org]. Remember, what you choose today might just change what you have the right to choose tomorrow.

Re:Better than NVIDIA's proprietary hardware (2, Insightful)

TheRaven64 (641858) | more than 5 years ago | (#24726011)

Using binary drivers makes it incredibly easy for old hardware to be orphaned. It's in nVidia's interest to encourage you to buy new hardware. They can do this by not supporting older drivers, and if you use something like Linux without a stable driver ABI (or even API) then you have a choice between using an old kernel or not using your GPU. They can introduce "stability improvements" in the old drivers that cause subtle performance degradation, making you think it's time to upgrade. Or they might decide your OS isn't important enough to support anymore. How are they doing for Win2K support? How long will WinXP keep being supported now that Vista is out? Linux may be fine, and so are Solaris and FreeBSD for nVidia support, but what about OpenBSD or Haiku? They can use DRI drivers, but not nVidia ones.

They can also leave security holes unpatched, like the issue a year or so back where you had a remote arbitrary code execution vulnerability in the driver by making it display pixmaps with certain characteristics (look at an image online and your machine's compromised). If there's an issue like this in an old card, and they don't release a patch, then you can't safely use the card at all, except maybe in VESA mode.

Re:Better than NVIDIA's proprietary hardware (1)

slash.duncan (1103465) | more than 5 years ago | (#24726125)

Because proprietary does /not/ work better than the alternative, for some of us. Open does, because that's what xorg/gnome/kde support first, while the NVidia users of the community complain that the new software either doesn't work or is so slow as to be unusable. (See for example any of the recent xorg updates, where the proprietary drivers were holding back inclusion or stabilizing of the latest xorg, or KDE 4.1, where NVidia cards are simply unworkably slow for many users.)

That, and with the Dells and Asuses and Acers of the world now releasing and supporting computers with Linux installed, proprietary driver hardware just isn't as practical for them as open driver hardware. That's /got/ to be putting a pinch on NVidia right now, as they're now the only one refusing to cooperate with the community and provide at least specs, if not sponsor developers to provide open drivers.

Not everybody's so focused on games, you know. Some people want 3D and etc for the latest 3D desktop effects. While NVidia has arguably dominated the proprietary/gaming Linux market, they don't even have a horse in the open/desktop market yet, and their announced policy is that they don't intend to, either. The market changed and NVidia, like MS with the Internet and Intel with x86_64, was late to the game. Only MS and Intel both hugely dominated the market and had enough resources to survive the dry spell that resulted. NVidia neither dominates the market nor has the MS/Intel resources. While Linux is still low market share, it has taken advantage of the MS Vista misstep and the explosion of the netbook phenomenon and if Intel's projections are to be believed, that's a HUGE opening, that NVidia is all set to miss out on. Couple that with the GPU/CPU synergy that's all the rage now, and NVidia's looking pretty lonely out there by itself, missing two changes that could either one or both be as significant as the X86_64 thing has been.

Re:It's not ideological it's performance (0)

Anonymous Coward | more than 5 years ago | (#24726183)

>For goodness sake already, why won't people stop being
>so ideological and just USE the damn hardware

I agree. That's why I ditched Nvidia. Their damn proprietary drivers caused one too many kernel panics, and they didn't push out drivers in sync with the Linux kernels that my distribution of choice, Debian, shipped, making me wait for Nvidia to catch up. The last straw was when Nvidia declared MY hardware legacy and stopped bugfixing their crappy, yes CRAPPY, drivers.

What's the definition of crappy? KERNEL PANIC. NO EXCUSES. For now I'll go with Intel because they work WITH me, NOT against me. When the xorg folks have stable 3D graphics drivers for AMD's latest batch of GPUs, written from AMD's documentation, I'll give AMD another look.
 

Re:Better than NVIDIA's proprietary hardware (0, Flamebait)

PenguSven (988769) | more than 5 years ago | (#24725811)

NVIDIA provide a Linux driver, and if you sweaty linux types actually used your computer, rather than just continuously reading the lines of code that make it work while masturbating, you wouldn't really care if the driver source code is available or not.

Re:Better than NVIDIA's proprietary hardware (1)

ThePhilips (752041) | more than 5 years ago | (#24726155)

Intel? documents??

They have published a few specs - and only after a number of online petitions and much PR harassment.

As far as specs go, nVidia is in some respects less hated than Intel: the latter has a longer history of keeping everything confidential, sometimes not sharing even with partners.

That's of course different in markets Intel is trying to enter right now, e.g. telecom: there they are very nice and polite, often sending you updated specs without even asking.

But as far as the desktop market is concerned, make no mistake: Intel can be the worst partner. As long as they have an edge over competitors, they won't lift a finger to help OSS - which, by the way, is exactly what they are doing in the desktop market.

Gee, How "Forward Thinking" of You, NVidia! (4, Interesting)

Kneo24 (688412) | more than 5 years ago | (#24725665)

"OH MY GOD! CPU AND GPU ON ONE DIE IS STOOOOOOOOPIIIIIDDDDDEDEDDDD!!!1111oneoneone"

How stupid is it really? So what if the average consumer actually knows very little about their PC. That doesn't necessarily mean it won't be put into a person's PC.

If they were really forward-thinking, they could see it as an effort to bridge the gap between low-end PCs and high-end PCs. Now maybe, at some point in the future, people can do a little better at gaming on those PCs.

Instead of games being nigh unplayable, they would run slightly more smoothly. With advances in this design, it could really work out better.

Sure, for the time being, I don't doubt that the obvious choice would be to have a discrete component solution for gaming. However, there might be a point where that isn't in gamers' best interests anymore. I'm not a soothsayer; I don't know.

Still, I can't help but imagine that Intel's and AMD's ideas can only help everyone as a whole.

Re:Gee, How "Forward Thinking" of You, NVidia! (1)

Jeff DeMaagd (2015) | more than 5 years ago | (#24726027)

The one thing I see is that it's merging the two different product types that have different rates of advancement. I would think that might not be such a good idea. This might make the performance gulf between typical systems and gaming systems a lot larger.

Re:Gee, How "Forward Thinking" of You, NVidia! (0)

Anonymous Coward | more than 5 years ago | (#24726153)

The problem is the naysayers believe this is the next evolution of the graphics card. It is not. This is merely a solution for low-end graphics - a segment which isn't updated as often; nVidia is still on the 8600 for low-end stuff, which is two generations behind - and it'll be perfect for family PCs, office work, etc. It is not designed for the gamer; that would simply not work. The heat load and the amount of silicon needed would make it a distant reality. The future of high-performance graphics is the graphics card, not an on-die CPU/GPU.

Re:Gee, How "Forward Thinking" of You, NVidia! (3, Insightful)

Lonewolf666 (259450) | more than 5 years ago | (#24726175)

I think Fusion is an alternative to current integrated graphics, not to separate high-performance GPUs. After all it shares the drawback of having to steal bandwidth from the regular RAM, where discrete graphics cards have their own memory.

For a moderately power-hungry graphics chip (think 20 watt) the advantage is that Fusion can share the CPU cooler with a moderately power-hungry CPU, while integrating the GPU elsewhere on the board will require a separate cooler. That takes extra board area and money.
So I think Fusion might be able to perform somewhat better for the same price than other integrated graphics. Which means it threatens the widely used Intel boards with integrated graphics rather than NVidia.

To mention something else (but still mostly on topic):
In the Linux market, AMD is currently building a lot of goodwill by providing documentation and Open Source driver code. That might become an advantage over NVidia too.

Re:Gee, How "Forward Thinking" of You, NVidia! (0)

Anonymous Coward | more than 5 years ago | (#24726543)

> where discrete graphics cards have their own memory.

You forgot the important part: where discrete graphics cards have their own memory with at least ten times the bandwidth (but in some cases also ten times the latency) of the CPU memory.

Re:Gee, How "Forward Thinking" of You, NVidia! (1)

ThePhilips (752041) | more than 5 years ago | (#24726177)

Right now GPUs cannot be used widely by software because they are relatively expensive and support is sparse.

The point is to integrate the GPU functions deeper into system, allowing cheap low-end integrated boards to also have GPUs.

nVidia tries hard to keep GPU acceleration exclusive to the high end. Intel and AMD/ATI want it to hit the low end - the market where most of the money is.

Classic case of disruption (4, Insightful)

propanol (1223344) | more than 5 years ago | (#24725755)

Ten years ago you would see Nvidia GPUs in everything from low- to high-end. Today, not so much - Intel dominates the low-end spectrum, with ATI hanging onto a somewhat insignificant market share. Larrabee is Intel moving upmarket. Sure, it might not perform as well as the latest Nvidia or ATI high-end GPU, but it might be enough in terms of performance, or have other benefits (better OSS support), to win some of Nvidia's current market share over. Considering it's supposedly the Pentium architecture recycled, it's also reasonable to assume the design will be relatively cost-effective and allow Intel to sell at very competitive prices while still maintaining healthy profit margins.

It's a classic case of disruption. Intel enters and Nvidia is happy to leave because there's a segment above that's much more attractive to pursue. Continue along the same lines until there's nowhere for Nvidia to run, at which point the game ends - circle of disruption complete. See also Silicon Graphics, Nvidia's predecessor in many ways.

Re:Classic case of disruption (5, Insightful)

eddy (18759) | more than 5 years ago | (#24726009)

>ATI hanging onto a somewhat insignificant market share.

C'mon, 17 million units shipped in a quarter and ~20% of the market is hardly 'a somewhat insignificant market share' in a market with four major players (the others being Intel, nVidia and VIA).

For comparison, take Matrox: they have an insignificant market share, at about 100K units per quarter.

Re:Classic case of disruption (2, Insightful)

pavon (30274) | more than 5 years ago | (#24726669)

No, he is saying that ATI has an insignificant portion of the low-end market, which is true. Both ATI and NVIDIA cards are now seen as upgrades to the default Intel chipset in practically every laptop sold, whereas in the past they provided both the low-end and high-end cards for that market.

Re:Classic case of disruption (2, Insightful)

Calinous (985536) | more than 5 years ago | (#24726073)

Intel was always the biggest graphics chip provider - and I don't think it was ever below 40% of the total market (by unit count, at least). With all their expensive and cheap graphics chips, ATI and NVidia were unable to dethrone Intel's integrated graphics division.

Re:Classic case of disruption (5, Insightful)

cnettel (836611) | more than 5 years ago | (#24726307)

Ten years ago, the Riva TNT was still a few months away. S3 and ATI both had a great market share in the low to mid-end, and 3dfx dominated the very top segment for gamers.

I always require intel chipsets when I purchase... (1)

msevior (145103) | more than 5 years ago | (#24725791)

Because the drivers are open source and work out of the box on every modern Linux distro.

I like my compiz eye-candy and Intel delivers more than enough performance for it.

Re:I always require intel chipsets when I purchase (0)

Anonymous Coward | more than 5 years ago | (#24725895)

I'd like it even better with FBO support, which was supposed to be right around the corner a year or so ago...

Re:I always require intel chipsets when I purchase (0)

Anonymous Coward | more than 5 years ago | (#24726637)

I think NVidia missed a trick: a BSD/GPL driver that trailed the proprietary blob by a couple of years would mean I'd still be running their chipsets. As it is, Intel is _the_ vendor if you plan to run a F/OSS-based system.

I'll tell you what will be scathing.. (2, Insightful)

sleeponthemic (1253494) | more than 5 years ago | (#24725801)

If, in the future, the trend is for all GPUs to be integrated.

Intel, nvidia, AMD and ATI...

Who is the odd one out there?

AMD is in the Best Position (5, Interesting)

Patoski (121455) | more than 5 years ago | (#24725807)

Lots of people here and analysts have written off AMD. I think AMD is in a great position if they can survive their short-term debt problems, which is looking increasingly likely.

Consider the following:

  • Intel's GPU tech is terrible.
  • Nvidia doesn't have x86 design or manufacturing experience, an x86 license, or even any x86 technology they want to announce.
  • AMD currently has the best GPU technology, and its CPU technology is very close to Intel's.

AMD is in a great position like no other company to capitalize on the coming CPU / GPU convergence. Everyone jeered when AMD bought ATI but it is looking to be a great strategic move if they can execute on their strategy.

AMD has the best mix of technology, they just have to put it to good use.

Re:AMD is in the Best Position (4, Interesting)

Hektor_Troy (262592) | more than 5 years ago | (#24725927)

Well, if you read the reviews, AMD's integrated graphics solution, the 780G, kicks ass. Only the very newest Intel integrated chipset is slightly better, but that uses around 20 W compared to the AMD chipset's 1 W.

And when you move up to 790gx with side port ram.. (1)

Joe The Dragon (967727) | more than 5 years ago | (#24726263)

And when you move up to the 790GX with side-port RAM, AMD systems get even faster without using system RAM. Also, Intel's poor drivers are unlikely to make any Intel GPU good.

Re:AMD is in the Best Position (2, Interesting)

Bender_ (179208) | more than 5 years ago | (#24725969)

Just looking at this from a manufacturing side:

AMD is roughly two years behind Intel in semiconductor process technology. Due to this and other reasons (SOI, R&D and SG&A vs. revenue) they are in a very bad cost position. Even if they have a better design, Intel is easily able to offset this with pure manufacturing power.

The playground is more level for Nvidia vs. ATI since both rely on foundries.

It's tough to tell whether ATI/AMD will be able to capitalize on this situation. They are very lucky to have a new opportunity; otherwise they would be toast.

Two things are for certain: Nvidia is getting into rougher waters soon and Intel will not give up on this one easily.

Re:AMD is in the Best Position (1)

ZosX (517789) | more than 5 years ago | (#24726081)

This is certainly true as well. When you manufacture on a smaller fabrication process you can make more CPUs from the same material. Intel spent billions to stay ahead of the curve, and it has enabled them to solidify their position. AMD is playing catch-up in a serious way, and what early gains they made (x86-64, multi-core, low power) have been adopted by Intel. Don't worry, AMD is not going anywhere. Intel will always need a second x86 chip manufacturer in the market to avoid further anti-trust litigation, hence the cross-licensing agreements that have handed the AMD64 instruction set to Intel. But hey, we all needed a standard, and I'm pretty glad the Itanic never really got all that far, along with the i960 and all of Intel's other not-so-great ideas.

Re:AMD is in the Best Position (0)

Anonymous Coward | more than 5 years ago | (#24726209)

"AMD is roughly two years behind Intel in semiconductor process technology."

"The playground is more level for Nvidia vs. ATI since both rely on foundries."

But how much does having your own process technology matter these days, versus contracting out to a foundry?

I'm asking this as a sincere question and without really understanding the issues in depth. But somebody around here must have an opinion, maybe even a knowledgeable one (:-)). I know that Intel has a big lead in terms of their own factory/manufacturing capability versus AMD, but how much of an advantage is that if you can buy the capability as a service? Or is CPU manufacture so much more specialized and cutting-edge than GPU manufacturing that you must have your own to be competitive?

Re:AMD is in the Best Position (4, Interesting)

TheRaven64 (641858) | more than 5 years ago | (#24726029)

Nvidia doesn't have x86 design or manufacturing experience, an x86 license, or even any x86 technology they want to announce

True. They do, however, have an ARM Cortex A8 system on chip, sporting up to four Cortex cores and an nVidia GPU in a package that consumes under 1W (down to under 250mW for the low-end parts). Considering the fact that the ARM market is currently an order of magnitude bigger than the x86 market and growing around two-three times faster, I'd say they're in a pretty good position.

Re:AMD is in the Best Position (0)

Anonymous Coward | more than 5 years ago | (#24726247)

Finally someone in this thread who knows what they are talking about! Thank you!

Just missing good mobo chipsets (1)

HalAtWork (926717) | more than 5 years ago | (#24726037)

With VIA going and nVidia rumored to stop developing chipsets (at least it won't make it any easier for AMD/ATI even if they continue), AMD is missing someone to develop and manufacture good motherboard chipsets.

Spider platform (2, Informative)

DrYak (748999) | more than 5 years ago | (#24726667)

AMD is missing someone to develop and manufacture good motherboard chipsets.

Haven't you been following the news recently?!

ATI/AMD's latest series of chipsets (the 790) is quite good. That's the reason VIA announced it was dropping that market in the first place.

The only problem is that currently, nVidia's SLI is a proprietary technology requiring licensing. So that's why a lot of players still buy nVidia's chipsets and avoid ATI's - not that the latter are bad, on the contrary, but they lack the license required for SLI.

This SLI problem also explains why nVidia may have to consider no longer producing Intel chipsets: they never licensed their SLI technology to Intel for SLI-compatible Intel-made chipsets (either forcing gamers to use nVidia chipsets or requiring convoluted hacks with SLI chips acting as bridges between the main northbridge and the GPUs, as in Skulltrail).
And Intel is now retaliating by refusing nVidia access to QuickPath Interconnect.

So either nVidia will have to drop the Intel chipset market (and only produce SLI bridges as in the Skulltrail hack), or nVidia will have to open up SLI licensing and thus lose an interesting market that they had managed to lock up.
Hence the rumors you mention.

Nonetheless, they aren't going to stop producing chipsets for AMD (still popular among gamers) or for VIA (they have even announced a new chipset able to play DX10 games and run Vista in all its Aero glory on VIA Isaiah ITX platforms).

Re:Spider platform (0)

Anonymous Coward | more than 5 years ago | (#24726691)

Sure, but AMD/ATI will only put out a satisfactory chipset until third parties swoop in and take over production. AMD has said in the past that they are not interested in creating and marketing chipsets; they only want to provide a base from which third parties can develop their own products.

Re:AMD is in the Best Position (5, Interesting)

ZosX (517789) | more than 5 years ago | (#24726057)

Nvidia does indeed have a license to x86. They acquired it when they bought all of 3dfx's intellectual property. They do, in fact, manufacture a 386SX clone. Rumors have been persisting that they are looking to enter the x86 market. It should be noted that they are still relative outsiders in that their licensing doesn't extend to the x86-64 instruction set, which is taking over the market now.

THIS JUST IN! (2, Funny)

Anonymous Coward | more than 5 years ago | (#24725843)

Company says competitor's product sucks! News at 11.

So Larrabee large will be the equivalent... (1)

voss (52565) | more than 5 years ago | (#24725847)

of a GeForce 8 series (which came out in 2006)???

EXCELLENT!

Thanks, AMD, for suggesting Intel. I'm gonna save lots of money on my next motherboard by not needing an nvidia graphics card!

in raw power he's probably right (1)

Z80a (971949) | more than 5 years ago | (#24725849)

But the extra programmability Larrabee has, since it's just a bunch of CPUs with some GPU instructions - won't that allow the kind of workarounds, optimizations and different rasterization techniques that let it overcome the raw-power barrier?

Not to mention the resurrection of some techniques that never caught on with fixed triangle-rendering hardware, like NURBS, voxels and so on?

Re:in raw power he's probably right (2, Insightful)

TeknoHog (164938) | more than 5 years ago | (#24725941)

But the extra programmability Larrabee has, since it's just a bunch of CPUs with some GPU instructions

Agreed -- why stick to GPU applications when you have a general-purpose multicore machine? How about getting those new instructions into general usage -- remember how MMX was originally introduced for stuff we now run on GPUs.

As for traditional GPU applications, there's already an OpenGL driver for the Cell SPUs in development. A similar driver for a generic multicore machine would be nice, particularly if it's not limited to Larrabee and x86. Of course we already have software implementations of OpenGL, but I wonder how well those scale with dozens of CPUs.
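On that scaling question, the obvious way a software renderer spreads work over many cores is to split the framebuffer into independent bands, one per thread; the pthreads sketch below is a deliberately simplified illustration (the band split and the stand-in "shader" are made up for the example). The embarrassingly parallel part scales almost linearly - the hard part in a real OpenGL implementation is the shared state (depth buffer, triangle binning, state changes), which is exactly where dozens of cores get interesting:

<ecode>
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define WIDTH   640
#define HEIGHT  480
#define THREADS 8                /* imagine "dozens" here */

static uint32_t framebuffer[WIDTH * HEIGHT];

struct band { int y0, y1; };

/* Each thread shades one horizontal band; bands share nothing, so this
 * part scales well. Real drivers also share depth buffers and bins. */
static void *shade_band(void *arg)
{
    struct band *b = arg;
    for (int y = b->y0; y < b->y1; y++)
        for (int x = 0; x < WIDTH; x++)
            framebuffer[y * WIDTH + x] = (uint32_t)(x ^ y);  /* stand-in shader */
    return NULL;
}

int main(void)
{
    pthread_t tid[THREADS];
    struct band bands[THREADS];
    int rows = HEIGHT / THREADS;

    for (int i = 0; i < THREADS; i++) {
        bands[i].y0 = i * rows;
        bands[i].y1 = (i == THREADS - 1) ? HEIGHT : (i + 1) * rows;
        pthread_create(&tid[i], NULL, shade_band, &bands[i]);
    }
    for (int i = 0; i < THREADS; i++)
        pthread_join(tid[i], NULL);

    printf("pixel(3,2) = %u\n", (unsigned)framebuffer[2 * WIDTH + 3]);  /* 3 ^ 2 = 1 */
    return 0;
}
</ecode>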

Attention nvidia fanbois (0, Troll)

fbhua (782392) | more than 5 years ago | (#24725857)

I've seen a lot of posts lately claiming that ATI's superiority is "subjective at best" and nvidia still offers the "Best performance at a certain price level". Now you have it straight from the horse's mouth. What do you say to this?

What bullshit. (5, Interesting)

Anonymous Coward | more than 5 years ago | (#24725859)

From the SIGGRAPH paper they need something like 25 cores to run GoW at 60 Hz. That's with 1 GHz cores for comparison, though. LRB will probably run at something like 3 GHz, meaning you only need something like 8-9 cores to run GoW at 60, and with benchmarks stretching up to 48 cores you can see that this has the potential to be very fast indeed.

More importantly, the LRB has much better utilization since there aren't any fixed function divisions in the hardware. E.g. most of the time you're not using the blend units. So why have all that hardware for doing floating point maths in the blending units when 99% of the time you're not actually using it? On LRB everything is utilized all the time. Blending, interpolation, stencil/alpha testing etc. is all done using the same functionality, meaning that when you turn something off (like blending) you get better performance rather than just leaving parts of your chip idle.
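A minimal sketch of that utilization argument - plain C with compile-time specialization standing in for Larrabee's JIT-compiled inner loop (whose real mechanism isn't shown here): when blending is disabled, the blend arithmetic disappears from the generated loop entirely, instead of sitting around as idle fixed-function hardware:

<ecode>
#include <stdint.h>
#include <stdio.h>

/* Stand-in "pixel shader". */
static inline uint32_t shade(uint32_t src) { return src; }

/* One generic span filler, specialized at compile time. Because BLEND is a
 * constant, the compiler folds the branch away: the no-blend variant contains
 * no blend math at all -- the software analogue of not paying for idle
 * fixed-function blend units. */
#define DEFINE_FILL_SPAN(NAME, BLEND)                                      \
static void NAME(uint32_t *dst, const uint32_t *src, int count)           \
{                                                                          \
    for (int i = 0; i < count; i++) {                                      \
        uint32_t c = shade(src[i]);                                        \
        if (BLEND) {                    /* folded away when BLEND == 0 */  \
            uint32_t d = dst[i];        /* placeholder 50/50 blend */      \
            c = ((c >> 1) & 0x7F7F7F7Fu) + ((d >> 1) & 0x7F7F7F7Fu);       \
        }                                                                  \
        dst[i] = c;                                                        \
    }                                                                      \
}

DEFINE_FILL_SPAN(fill_span_opaque, 0)   /* blending disabled */
DEFINE_FILL_SPAN(fill_span_blended, 1)  /* blending enabled */

int main(void)
{
    uint32_t src[4] = { 0xFF0000FFu, 0x00FF00FFu, 0x0000FFFFu, 0xFFFFFFFFu };
    uint32_t dst[4] = { 0 };
    fill_span_opaque(dst, src, 4);   /* dst now holds the shaded pixels */
    fill_span_blended(dst, src, 4);  /* blends new pixels over the old ones */
    printf("dst[0] = %08X\n", (unsigned)dst[0]);
    return 0;
}
</ecode>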

I'd also like to point out that having a software pipeline means faster iteration, meaning that they have a huge opportunity to simply out-optimize nvidia and AMD, even for the D3D/OGL pipelines.

Furthermore, imagine Intel supplying half a dozen "profiles" for their pipeline where they optimize for various scenarios (e.g. deferred rendering, shadow-volume-heavy rendering, etc.). The user can then try each with their games and run each game with a slightly different profile. More importantly, however, new games could just spend 30 minutes figuring out which profile suits them best, set a flag in the registry somewhere, and automatically get a big boost on LRB cards. That's a tiny amount of work to get LRB-specific performance wins.

The next step in LRB-specific optimizations is to allow developers to essentially set up an LRB config file for their title with lots of variables and tuning (remember that LRB uses a JIT-compiled inner loop that combines the setup, tests, pixel shader, etc.). This would again be a very simple thing to do (and Intel would probably do it for you if your title is high-profile enough), and could potentially give you a massive win.

And then of course the next step after that is LRB-specific code, i.e. you write stuff outside D3D/OGL to leverage the LRB specifically. This probably won't happen for many games, but you only need to convince Tim Sweeney and Carmack to do it, and then most of the high-profile games will benefit automatically (through licensing). My guess is that you don't need to do much convincing. I'm a graphics programmer myself and I'm gagging to get my hands on one of these chips! If/when we do, I'll be at work on weekends and holidays coding up cool tech for it. I'd be surprised if Sweeney/Carmack aren't the same.

I think LRB can be plenty competitive with nvidia and AMD using the standard pipelines, and there's a very appealing low-friction path for developers to take to leverage the LRB specifically with varying degrees of effort.

Re:What bullshit. (1)

ZosX (517789) | more than 5 years ago | (#24726107)

You forgot to mention that these will likely be installed on a huge portion of the desktop market. Better performance than Intel's past offerings in the integrated market will be extremely welcome to game developers - and who doesn't want to expand their market?

Re:What bullshit. (1)

nbates (1049990) | more than 5 years ago | (#24726115)

I read this on Wikipedia:

"A June 2007 PC Watch article suggests that the first Larrabee chips will feature 32 x86 processor cores and come out in late 2009, fabricated on a 45 nanometer process. Chips with a few defective cores due to yield issues will be sold as a 24-core version. Later in 2010 Larrabee will be shrunk for a 32 nanometer fabrication process which will enable a 48 core version."

I'm not sure if this has anything to do with it.

Re:What bullshit. (0)

Anonymous Coward | more than 5 years ago | (#24726553)

Too bad their SIGGRAPH presentation was so disappointing; it was just a hidden commercial, lacking any detail. A lot of graphics researchers (but who cares about these guys anyway, right?) were pissed off by this blatant abuse of the SIGGRAPH papers system.

Good future ahead (1)

Piranhaa (672441) | more than 5 years ago | (#24725869)

Regardless of Larrabee having crap performance when it's released in 2010, this is a step in the right direction, even if Intel doesn't know how to make very good graphics cards. In time, I'm sure, there won't be a difference between CPU and GPU, and all the memory will be shared (for the majority of systems). We'll be looking back saying "wow, why would anyone want that extra memory on the graphics card just sitting there while my system has maxed out its main physical memory?". If Intel drops into the graphics market, with AMD already there and NVIDIA soon to follow (if the rumours are correct), it really looks like an interesting future. Good luck boys!

Re:Good future ahead (1)

eric-x (1348097) | more than 5 years ago | (#24726651)

Unless system memory and the bus become an order of magnitude faster than the CPU can handle, sharing system memory with the GPU will slow down the CPU.

One solution I can envision is to have a second memory bank (like going from dual channel to quad channel). If you move all the CPU memory to bank 1 and all the texture/geometry data to bank 2, the GPU and CPU do not have to wait for each other. Extra wait cycles are introduced when CPU data spills over into the GPU bank, and vice versa.

Intel chipsets will always work better for me... (0, Troll)

Simon (S2) (600188) | more than 5 years ago | (#24725883)

... than the NVidia ones. Intel has open-source drivers; NVidia does not. So, NVidia, even if your cards are from 2010 and Intel's are from 2006, I'll buy theirs, because they work better, and out of the box, on my home desktops. When you are ready to release open drivers for your hardware, you can start to compare your products to those of your competitors. Even AMD understood that.

GPU from 2006? (0)

Anonymous Coward | more than 5 years ago | (#24725951)

What, like the 8800GTX? A card that has only within the past quarter or so been bested in every benchmark?

KDE users will be confused (1)

drolive (1338037) | more than 5 years ago | (#24726083)

"[....] the "large" Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI."

As a KDE 4 user, I thought it was the 2008 Nvidia GPUs that had the same performance as the Intel Pentium III Tualatins from 2002.

I'd prefer a stable release from 2006... (4, Interesting)

S3D (745318) | more than 5 years ago | (#24726135)

With all supported OpenGL extensions working properly, over the latest and greatest from NVIDIA, where I can never be sure which extensions work with which driver on which card.

They're all doing it wrong (1)

Louis Savain (65843) | more than 5 years ago | (#24726145)

Whether it is GPU or CPU or GPGPU, they're all missing the mark, IMO. Those chips are a pain in the ass to program and they are not universal. Mixing MIMD parallelism with SIMD parallelism is a match made in hell. Multithreading is seriously flawed. Intel knows that. That's why they're so busy working on domain-specific dev tools to keep the programmer insulated from all the nastiness. It's not going to work because there is no flexibility. The industry is in dire need of a seismic paradigm shift and the longer it waits, the more painful it's going to be down the road. To find out how to solve the parallel programming crisis, read Transforming the TILE64 into a Kick-Ass Parallel Machine [blogspot.com].

Re:They're all doing it wrong (1)

Courageous (228506) | more than 5 years ago | (#24726445)

If you think GPGPU is hard to program, you are using the wrong tools. Check out RapidMind. It's very easy.

C//

so... (2, Funny)

800DeadCCs (996359) | more than 5 years ago | (#24726165)

So Intel will only be about four years behind the current state of the art in their graphics system when it comes out.
In that case, it's probably the biggest leap they'll ever have made.

I'm so fing tired of megalomaniacal corporations (0)

Anonymous Coward | more than 5 years ago | (#24726185)

Why do these over-paid executives get away with squandering corporate resources on a wild goose chase? Sadly, from my perspective, egoism and incompetence seem to be all too common at the top these days.

Re:I'm so fing tired of megalomaniacal corporations (1)

Lonewolf666 (259450) | more than 5 years ago | (#24726201)

Funny, I see it the other way round:
Trying something innovative has become rare; most executives prefer to go for the next iteration of "$ProvenProduct". What you call a wild goose chase, I call a refreshing attempt to do something new and better. Of course there is a risk of failure, but Intel can afford to take some risks.

Peter Glaskowsky is clueless (0)

Anonymous Coward | more than 5 years ago | (#24726205)

If you check this response [cnet.com] to his blog article, you can see someone has already debunked most of his claims about expected Larrabee performance. He sounds like he has no idea how graphics work.

If this is nVidia's expert witness, maybe they ARE scared about Larrabee.

Re:Peter Glaskowsky is clueless (1)

rauxbaught (1350387) | more than 5 years ago | (#24726325)

I actually posted that response, and Peter Glaskowsky is definitely not clueless. The article they're referring to, however, is pretty bad.

Curious how the shoe is on the other foot (1)

A_Non_Moose (413034) | more than 5 years ago | (#24726253)

FTA:

He [Mottram] also predicted that ATI would regret its focus on raw graphical power at the expense of more general-purpose capabilities.

Funny how at the 9500 Pro's release there was such a focus on not using raw video horsepower to draw frames, but using occlusion and other tricks to save time/memory/bandwidth and increase rendering speed.

Versus the GeForce line's approach of depending on raw horsepower and drawing everything in a scene.

Brute force or thinking ahead?

I have to admit, I like the idea that Kneo24 had upthread: make a game's speed depend more on the graphics card - not just on the system CPU (plus card) but on a CPU on the video card.

Interesting idea that would make upgrades more appealing.

Larrabee is marketing fail (0)

Anonymous Coward | more than 5 years ago | (#24726417)

Intel's marketing department must have failed Marketing 101: Perception is Reality.

The perception - and therefore reality - is that Intel graphics is suck. Any gamer knows that. Hence Larrabee is suck. It doesn't matter that Larrabee is a completely different product with completely different technology. It's Intel graphics and suck.

That's why they should position Larrabee as a GPGPU killer that you can use to build supercomputers and render farms. They should focus on things like sane memory management, cache coherency, branches, running normal C programs, etc. GPUs suck at that stuff, which makes GPGPU a royal pain in the ass.

The positioning statement is really simple: "Tons of floating point without the pain."

Re:Larrabee is marketing fail (1)

QuoteMstr (55051) | more than 5 years ago | (#24726619)

So you're claiming perceptions can never change, so Intel should just give up? More competition in this arena can only be good for us.
