
Intel Abandons Discrete Graphics

timothy posted more than 4 years ago | from the out-of-the-lifeboat dept.

Graphics

Stoobalou writes with this excerpt from Thinq: "Paul Otellini may think there's still life in Intel's Larrabee discrete graphics project, but the other guys at Intel don't appear to share his optimism. Intel's director of product and technology media relations, Bill Kircos, has just written a blog about Intel's graphics strategy, revealing that any plans for a discrete graphics card have been shelved for at least the foreseeable future. 'We will not bring a discrete graphics product to market,' stated Kircos, 'at least in the short-term.' He added that Intel had 'missed some key product milestones' in the development of the discrete Larrabee product, and said that the company's graphics division is now 'focused on processor graphics.'"

165 comments

Groan (3, Insightful)

Winckle (870180) | more than 4 years ago | (#32351202)

I hope they at least manage to incorporate some of what they've learnt into their integrated chips.

Intel's integrated chips have been appallingly bad in the past, some incapable of decoding HD video with reasonable performance. Manufacturers using those intel integrated chips in their consumer level computers did a great deal of harm to the computer games industry.

Re:Groan (0)

Anonymous Coward | more than 4 years ago | (#32351310)

You mean like Apple using the GMA950 and X3100 in their early intel Mac minis and MacBooks?

Starcraft II, Diablo 3, Steam on Mac OS X.... all great news except we can't use any of it because the intel integrated GPUs SUCK!

Re:Groan (0)

Anonymous Coward | more than 4 years ago | (#32351436)

Why not refer to something that's less than five years old? The "Intel HD Graphics" on the i5-661, clocked at 900MHz, runs general games very well, let alone what MOST people need.

Re:Groan (1)

toastar (573882) | more than 4 years ago | (#32351536)

You mean like Apple using the GMA950 and X3100 in their early intel Mac minis and MacBooks?

Starcraft II, Diablo 3, Steam on Mac OS X.... all great news except we can't use any of it because the intel integrated GPUs SUCK!

Starcraft 2 chokes on my radeon 5430

Re:Groan (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32351752)

That isn't saying much considering that the 5430 is an extremely low end GPU.

Re:Groan (1)

Jeng (926980) | more than 4 years ago | (#32351920)

Video cards for gaming.

Memory bandwidth within the video card is one of the best ways to grade a video card.

It doesn't matter much how much video RAM it has; it's the speed.

Having a newer video card helps a little, but the big thing is the video memory bandwidth.

Video processing is secondary for gaming because the primary bottleneck is the memory bandwidth.

If you have a low-end but new video card, it will perform comparably to the older version of the same card.

If you have an older premium card it will still perform better than a newer budget card primarily because of the memory bandwidth.
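
A rough sketch of the arithmetic behind that claim: peak memory bandwidth is roughly bus width times effective data rate. The two configurations below (an older 256-bit high-end card versus a newer 64-bit budget card) are illustrative assumptions, not quoted specs of any particular product.

```c
/* Back-of-the-envelope GPU memory bandwidth: bus width x effective data rate.
 * The two configurations are illustrative assumptions (an older 256-bit
 * high-end card vs. a newer 64-bit budget card), not quoted specs. */
#include <stdio.h>

static double bandwidth_gbs(int bus_bits, double effective_mts)
{
    /* bytes per transfer * million transfers per second, reported in GB/s */
    return (bus_bits / 8.0) * effective_mts * 1e6 / 1e9;
}

int main(void)
{
    printf("older 256-bit card @ 1800 MT/s: %.1f GB/s\n",
           bandwidth_gbs(256, 1800.0));   /* ~57.6 GB/s */
    printf("newer  64-bit card @ 1600 MT/s: %.1f GB/s\n",
           bandwidth_gbs(64, 1600.0));    /* ~12.8 GB/s */
    return 0;
}
```

The 4-5x gap in raw bandwidth is the kind of difference being pointed at here, although (as the reply below argues) how much of that shows up in actual game performance is debatable.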

Re:Groan (1)

Creepy (93888) | more than 4 years ago | (#32353076)

That isn't really true - memory bandwidths have a fairly minor impact, similar to CPU/memory - maybe 5% of the speed of a graphics card is from latencies due to memory (either bus or strobe latencies). The rest of the time is spent doing transforms, running shaders, doing depth testing, etc, and those processes depend on the GPU clock. The old AGP shared memory model is actually creeping back in (look at the G100s, for example), especially in the mobile processing area because memory bandwidth matters so little. There once was a time when main memory to GPU memory was a serious throttle, but those days died around AGP 4x and haven't returned.

Memory itself is a pain to compare, as it may be your GDDR5 is faster clock-wise than GDDR4, but the latencies negate any speed gain, same as for processor memory. To further complicate things, depending on burst size and how many times it has to call the strobes (e.g. RAS to CAS latency), you may get mixed results.

Re:Groan (1)

FreonTrip (694097) | more than 4 years ago | (#32352206)

True, but on the bright side you can now host a dedicated server on those systems without resorting to running Windows-in-a-box or rebooting.

Re:Groan (5, Informative)

Pojut (1027544) | more than 4 years ago | (#32351444)

For anyone stuck with an Intel GMA chipset: GMA Booster [gmabooster.com] may help solve some of your problems. Just make sure you have a decent cooling solution [newegg.com], as it can ramp up the heat output of your system considerably. Still, if you're stuck with GMA, it can make the difference between a game being unplayable and being smooth.

Limited to 950 (5, Informative)

manekineko2 (1052430) | more than 4 years ago | (#32351948)

Note for anyone else whose curiosity was piqued: this only works on 32-bit systems with the GMA 950 chipset, and does not work with the GMA X3100, GMA X4500, GMA 500, or GMA 900.

Re:Groan (0)

Anonymous Coward | more than 4 years ago | (#32352102)

I don't know about that. It reminds me of those "386to486 converter" programs that used to float around the old BBSs. Anything that claims to magically transform hardware via software should be viewed with extreme skepticism.

Re:Groan (1)

Pojut (1027544) | more than 4 years ago | (#32352176)

I've tried it, it definitely works. It "transforms" the hardware by greatly overclocking it :-)

Re:Groan (1)

TheRaven64 (641858) | more than 4 years ago | (#32352410)

Those 386to486 converter programs installed a handler for the illegal instruction trap, caught it, and emulated the dozen or so instructions that were introduced with the 486. They let you run programs that required a 486 (rather than having them crash with an illegal instruction error); they just ran a lot slower than on a real 486.

This is completely different; it just tweaks the clock speed of the hardware. Given that most people who have integrated graphics and can't upgrade are laptop users, who have limited and non-expandable cooling, it's a terrible idea.
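
For the curious, here is a hypothetical sketch of that trap-and-emulate trick, translated to a modern Linux/x86-64 analogue (the DOS-era TSRs hooked the invalid-opcode interrupt rather than a POSIX signal). The handler, the BSWAP decoding, and the register table are illustrative only, not a reconstruction of any actual 386to486 utility; on any post-486 CPU BSWAP executes natively, so the handler never actually fires.

```c
/* Illustrative trap-and-emulate sketch: catch SIGILL, decode the faulting
 * opcode, emulate it, and resume. Assumes Linux/x86-64 with glibc. */
#define _GNU_SOURCE
#include <signal.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <ucontext.h>

/* Map the 3-bit register field of BSWAP (0F C8+r) to mcontext indices. */
static const int reg_index[8] = {
    REG_RAX, REG_RCX, REG_RDX, REG_RBX,
    REG_RSP, REG_RBP, REG_RSI, REG_RDI
};

static void sigill_handler(int sig, siginfo_t *info, void *ctx)
{
    (void)sig; (void)info;
    ucontext_t *uc = (ucontext_t *)ctx;
    uint8_t *ip = (uint8_t *)(uintptr_t)uc->uc_mcontext.gregs[REG_RIP];

    /* Emulate BSWAP r32 (opcode 0F C8+r), an instruction new in the 486. */
    if (ip[0] == 0x0F && ip[1] >= 0xC8 && ip[1] <= 0xCF) {
        int r = ip[1] - 0xC8;
        uint32_t val = (uint32_t)uc->uc_mcontext.gregs[reg_index[r]];
        uc->uc_mcontext.gregs[reg_index[r]] = __builtin_bswap32(val);
        uc->uc_mcontext.gregs[REG_RIP] += 2;   /* skip the emulated opcode */
        return;
    }
    /* Anything we don't recognise really is an illegal instruction. */
    signal(SIGILL, SIG_DFL);
    raise(SIGILL);
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_sigaction = sigill_handler;
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGILL, &sa, NULL);

    puts("SIGILL emulation handler installed (illustrative only).");
    return 0;
}
```

As noted above, every trapped instruction costs a full fault, decode, and resume, which is why the emulated programs ran far slower than on a real 486; GMA Booster does nothing of the sort, it just raises the clock.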

So Intels next cpu will the same suck video build (0, Troll)

Joe The Dragon (967727) | more than 4 years ago | (#32351208)

So Intel's next cpu will the same suck video build in?

AMD will kill them.

Re:So Intels next cpu will the same suck video bui (0)

Anonymous Coward | more than 4 years ago | (#32351286)

ignoring you're complete inability to form a sentence, this article is about their discrete graphics line (like it says in the fucking title), not their integrated graphics. So while their next chip may or may not have "the same suck video" it's completely irrelevant to this conversation.

Re:So Intels next cpu will the same suck video bui (2, Funny)

Mr. Underbridge (666784) | more than 4 years ago | (#32351346)

ignoring you're complete inability to form a sentence

Hey everybody, 'tard fight! Come watch!

Re:So Intels next cpu will the same suck video bui (0, Offtopic)

oatworm (969674) | more than 4 years ago | (#32351520)

First person to count to "potato" wins!

Re:So Intels next cpu will the same suck video bui (2, Funny)

Anonymous Coward | more than 4 years ago | (#32351584)

its potatoe you dumb fuck.

Intel planed to put this tech into the next cpu an (1)

Joe The Dragon (967727) | more than 4 years ago | (#32351422)

Intel planned to put this tech into the next CPU, and this now seems to be dead, so what will Intel do?

Re:Intel planed to put this tech into the next cpu (0)

Anonymous Coward | more than 4 years ago | (#32351494)

Intel planned to put this tech into the next CPU, and this now seems to be dead, so what will Intel do?

According to who? Intel has always said Larrabee would be a discrete GPGPU based on the x86 instruction set. I've never seen anything from Intel saying this would be used on an integrated chip.

Re:Intel planed to put this tech into the next cpu (0)

Anonymous Coward | more than 4 years ago | (#32352818)

Larrabee was never intended to go into any next-gen Intel CPU, so it's not a problem for them at least in that respect.

Re:So Intels next cpu will the same suck video bui (0, Offtopic)

odin1899 (1193905) | more than 4 years ago | (#32351484)

I'll just ignore your complete inability to distinguish between a possessive and a verb contraction then.

Re:So Intels next cpu will the same suck video bui (0)

Anonymous Coward | more than 4 years ago | (#32351606)

ignoring you're complete inability to form a sentence

Never fails. Guy pulls a grammar faux pas when being a grammar Nazi to another poster. FFAS.

Mod parent "Likely." (4, Informative)

Spazntwich (208070) | more than 4 years ago | (#32351316)

Short of buying out Nvidia I don't see Intel having a consumer's chance in America of competing with AMD in the value sector for the next few generations of chips.

CPUs have been "fast enough" for years, but GPUs have not. AMD is going to laugh all the way to the bank being able to offer a $50 package that can run The Sims.

Re:Mod parent "Likely." (4, Insightful)

TheRaven64 (641858) | more than 4 years ago | (#32351394)

CPUs have been "fast enough" for years, but GPUs have not.

Really? I think you might want to take a look at what most people use their GPUs for. Unless you are a gamer, or want to watch 1080p H.264 on a slightly older CPU, a 4-5 generation old GPU is more than adequate. My current laptop is 3.5 years old, and I can't remember ever doing anything on it that the GPU couldn't handle. As long as you've got decent compositing speed and pixel shaders for a few GUI effects, pretty much any GPU from the last few years is fast enough for a typical user.

Re:Mod parent "Likely." (4, Informative)

Korin43 (881732) | more than 4 years ago | (#32351598)

As long as you've got decent compositing speed and pixel shaders for a few GUI effects, pretty much any ATI or nVidia GPU from the last few years is fast enough for a typical user.

Fixed that for you. Intel cards are fine for "normal" computer usage, but they still suck pretty bad at most games.

Re:Mod parent "Likely." (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#32351608)

Indeed. The GMA950 was crappy during its prime compared to ATI and nVidia shared-memory bargain-bin offerings 4 years ago, yet it's still being shipped in N270/N280 netbooks, and it's capable of running Aero fine. H.264 decoding is probably (or will be) more beneficial to casual users than gaming performance, and I believe many of Intel's mainstream desktop and notebook GPUs provide it.

Re:Mod parent "Likely." (1)

tepples (727027) | more than 4 years ago | (#32351618)

Unless you are a gamer, or want to watch 1080p H.264 on a slightly older CPU, a 4-5 generation old GPU is more than adequate.

But if you're a game developer, you have an interest in members of the public having PCs with more powerful GPUs, not the Voodoo3-equivalent without even hardware T&L that is a GMA 950.

Re:Mod parent "Likely." (0)

Anonymous Coward | more than 4 years ago | (#32351744)

The Voodoo 3 was bitching for its time. The GMA 950, however, was not. Not to mention that the Voodoo 3 can play Doom 3 [firingsquad.com] (well, it's a Voodoo 2, but can a GMA950 play Doom 3?). And it doesn't require any "flashlight mod".

Re:Mod parent "Likely." (1)

tepples (727027) | more than 4 years ago | (#32352524)

The Voodoo 3 was bitching for its time.

And its time was a decade ago.

Re:Mod parent "Likely." (1)

FreonTrip (694097) | more than 4 years ago | (#32352822)

*sigh* All right, I've got karma to burn.

The Voodoo3's limitations weren't trivial even for its time: 256x256 texture dimensions, forced 16-bit color, no real stencil buffer support, framebuffer size maxed out at 16 MB... A former 3dfx employee literally told me that it was a die-shrunk, bug-fixed Voodoo Banshee with an extra TMU popped onto its single lonely rendering pipeline. I owned one, and liked it tremendously, but it's based on technology first debuted 13 years ago.

As for Doom 3: yes, a MesaGL --> Glide wrapper exists that provides basic rendering functionality for Doom 3 and tricks the engine into sending its lighting straight to /dev/null/*. There are other wrappers that perform similar tricks for any vendor's GL drivers, and they will let you run on a GMA950 substantially better than the Voodoo3 could manage. And for whatever meager stakes we're playing for, a GMA950 can actually run Doom 3 with lighting when backed up by a dual-core CPU.++

* Metaphorically speaking.

++ Flexibility aside, software vertex shaders suck.

Re:Mod parent "Likely." (1)

Spazntwich (208070) | more than 4 years ago | (#32352058)

All I really meant by that is incremental improvements in GPU performance hold significantly greater "value" than commensurate increases in CPU speed. A bottom barrel consumer computer is going to be able to handle anything a common consumer is going to throw at it except gaming. Even the lightest gaming is about impossible on the most common consumer-class (Intel) GPUs, and unfortunately Intel is still the graphics chip in the vast majority of consumer computers.

Now bottom-barrel will still mean "game capable."

Intel's and AMD's offerings at the same price point will likely offer a 10%-20% discrepancy in overall CPU performance that will favor Intel but nobody will ever notice, while at the same price point AMD will be able to offer graphical performance eclipsing Intel's by several fold.

AMD is going to eat Intel's lunch, and this will be good news for all of us as the continuously floating "average" performance developers can target increases, but I digress.

depends on GP-GPU (1)

Creepy (93888) | more than 4 years ago | (#32353600)

I would say that is currently true, but the average user may care if General Purpose GPU (GP-GPU) takes off and they use applications that use it. For a speed example, I had what is essentially a math problem that kept a dual core CPU busy (and yes, it was threaded) for 2 weeks, 3 days, 14 hours. The same problem tackled by 216 GPU shaders and one CPU took around 25 minutes. While neither

I realize most people aren't doing surface detail analysis involving trillions of points of data like I was (actually, that was a brute force method, too - probably could have gotten it down to about a day, optimized), but I know a lot of people that use Photoshop, and imagine the same sort of gains for certain filters. Filters that were too slow to incorporate 5 years ago may be possible today.
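
For scale, the runtimes quoted above work out to a speed-up of roughly three orders of magnitude; a quick check of the arithmetic (using the figures from the comment, GPU time taken as 25 minutes):

```c
/* Quick sanity check of the speed-up quoted above:
 * CPU run: 2 weeks, 3 days, 14 hours; GPU run: ~25 minutes. */
#include <stdio.h>

int main(void)
{
    double cpu_minutes = ((2 * 7 + 3) * 24 + 14) * 60.0; /* 17 days 14 h */
    double gpu_minutes = 25.0;
    printf("CPU: %.0f min, GPU: %.0f min, speed-up: ~%.0fx\n",
           cpu_minutes, gpu_minutes, cpu_minutes / gpu_minutes);
    return 0;
}
```

That is roughly a 1000x speed-up, which is the kind of gain that makes GPU-accelerated filters interesting even for casual Photoshop users.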

Re:Mod parent "Likely." (0)

Anonymous Coward | more than 4 years ago | (#32351416)

They wouldn't need to buy them in order to create an integrated offering. Heck, they're already doing it with the Atom/ION product line. If AMD makes headway with all-in-one solutions, it will be damaging to both Intel and Nvidia and I'd put money on them teaming up to compete.

Re:Mod parent "Likely." (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#32351996)

Intel resisted letting ION be bundled with Atom chips. They preferred to continue shipping the anemic GMA950. The CPU wasn't capable of rendering HD, decoding couldn't be offloaded to the GPU, and they wanted to keep it that way.

Re:Mod parent "Likely." (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#32352038)

Also, this sounds like Intel wants to team up with nVidia. nVidia manages to put out something better than Intel, so Intel pulls their licence. http://arstechnica.com/hardware/news/2009/03/nvidia-countersues-intelright-on-schedule.ars [arstechnica.com]

Re:Mod parent "Likely." (1)

ElectricTurtle (1171201) | more than 4 years ago | (#32352440)

Intel and nVidia are pretty openly hostile [intelsinsides.com] to each other. (Though as represented by Jen-Hsun Huang, nVidia is hostile to just about everybody. nVidia was actually AMD's first choice for their merger to create the Fusion platform, but Jen-Hsun Huang's ego was too huge to compromise and AMD was a little short on the capital necessary at the time to be in a strong bargaining position.) However with Larrabee in the trash heap of history, Intel needs nVidia now more than ever, but unless Intel really comes groveling back to Jen-Hsun with a killer sweetheart deal of some kind, I can't see any kind of productive partnership between the two companies in time to compete with AMD's Fusion.

As painful as the AMD/ATI merger was, I do think that Fusion is poised to be a game changing architecture, just as important if not significantly moreso than the integration of floating point processing and memory controllers into CPU dies.

Re:Mod parent "Likely." (1)

Jackie_Chan_Fan (730745) | more than 4 years ago | (#32352288)

Cpu's have never been fast enough :) My Quad Core could be a dual 8 core.. and I'm still at the mercy of the CPU while rendering.

Anyone doing music production at home, or professional, or 3d graphics at home or professional... or photoshop work at home or.. well you get it.

CPU is and will always be a factor. That son of a bitch is never fast enough for me :)

Re:So Intels next cpu will the same suck video bui (0)

Anonymous Coward | more than 4 years ago | (#32351538)

I buy AMD chips, *but* I would not pay for AMD graphics. Reason is it doesn't run nearly as well under Linux as nVidia chips.

And no, I don't really care about opensource drivers, I just want hardware I bought to work under OS I use. So for now, that will be AMD + nVidia for best bang for the buck. If AMD puts integrated graphics on their chips, then that's a negative for me as I'm not willing to pay for something that doesn't work. Intel integrated graphics work better than AMD under Linux at the moment.

Re:So Intels next cpu will the same suck video bui (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#32351958)

That's a good criterion; however, I don't like a vendor that releases defective junk that delaminates from the mounting package, rendering the entire system useless under any OS: http://hplies.com/ [hplies.com]

Re:So Intels next cpu will the same suck video bui (1)

ElectricTurtle (1171201) | more than 4 years ago | (#32352272)

This is important, and a reason that I'm likely to buy an AMD/ATI card for my next upgrade after being an nVidia loyalist for the better part of a decade.

Re:So Intels next cpu will the same suck video bui (0)

Anonymous Coward | more than 4 years ago | (#32352400)

Actually the RadeonHD project has drivers available for the AMD chipsets sold on motherboards today. Just last month I bought an Asus mainboard with integrated ATi graphics in its AMD chipset and it works flawlessly under Ubuntu 10.04. Just Google the specs of the chipset and then the Linux/X.org/RadeonHD support for it. These days I'd rather have an integrated ATi that works conveniently and efficiently for less than having an nVidia add-on board with its own cooling and a proprietary blob of code that no one except nVidia can fix. I'd actually be surprised if you can buy an AMD chipset with integrated graphics that's not supported by the latest distros. I haven't tried ATi add-on boards but RadeonHD has support for those as well.

As for Intel graphics, well, you get what you pay for.

Re:So Intels next cpu will the same suck video bui (1)

Nadaka (224565) | more than 4 years ago | (#32353084)

Same here. I am running an Asus motherboard with the Radeon HD 3300 graphics chipset, dual-booting Ubuntu (9.10, now 10.04) and Windows 7. I have drivers available under both OSes and the hardware is plenty good enough for everything that I have thrown at it (admittedly, I am not exactly playing Crysis, but it decodes 720p without a hiccup and plays all the games that I both have and like).

Not really (4, Insightful)

Sycraft-fu (314770) | more than 4 years ago | (#32351544)

Everyone gets up on Intel integrated GPUs because they are slow, but they are looking at it from a gamer perspective. Yes, they suck ass for games, however that is NOT what they are for. Their intended purpose is to be cheap solutions for basic video, including things like Aero. This they do quite well. A modern Intel GMA does a fine job of this. They are also extremely low power, especially the newest ones that you find right on the Core i5 line in laptops.

Now what AMD may do well in is a budget gaming market. Perhaps they will roll out solutions that cost less than a discrete graphics card, but perform better than a GMA for games. That may be a market they could do well in. However they aren't going to "kill" Intel by any stretch of the imagination. For low power, non-gaming stuff using minimal power is the key and the GMA chips are great at that. For the majority of gaming, a discrete solution isn't a problem ($100 gets you a very nice gaming card these days) and can be upgraded.

Re:Not really (2, Informative)

quanticle (843097) | more than 4 years ago | (#32352108)

Their intended purpose is to be cheap solutions for basic video, including things like Aero.

Well, it depends on your definition of basic video, of course. I mean, I've seen Intel GMA chipsets struggle to display a 1080p Blu-Ray movie. Given that consumers increasingly are going to be hooking up their laptops to TVs and other larger displays, saying, "Oh, that's not basic video," isn't going to cut it.

Re:Not really (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32352162)

Now what AMD may do well in is a budget gaming market. Perhaps they will roll out solutions that cost less than a discrete graphics card, but perform better than a GMA for games. That may be a market they could do well in. However they aren't going to "kill" Intel by any stretch of the imagination.

I think you're selling AMD short. Intel has always relied on its fabs to keep things competitive even when their designs weren't. Now that die shrinks are beginning to reach a point of diminishing returns, Intel cannot rely on their fabs as heavily. Also, Fusion certainly has the potential to break Intel's lock on the low power IGP market.

Ever since they ran Ruiz out, AMD has been executing brilliantly. If AMD is able to come close to equaling Intel's CPU tech, and considering that ATI's GPU tech spanks Intel's, how much better do you all think Bulldozer will ultimately be than Larrabee? I'm guessing metric tons, considering applications (e.g. Photoshop) are now utilizing GPU acceleration for stream processing (not just HD accel. and Aero).

If AMD can keep this pace up, Intel is in for some deep hurting... You can double the hurting if (and it is a big if) ARM starts moving up the hardware stack (e.g. iPad) in the next couple of years.

Re:Not really (1)

blahplusplus (757119) | more than 4 years ago | (#32352968)

"Everyone gets up on Intel integrated GPUs because they are slow, but they are looking at it from a gamer perspective. "

Everyone SHOULD get up on Intel. Intel doesn't approach computers as a platform like it should, and that's a huge problem for the #1 player in the industry when it comes to CPUs and motherboard chipsets. Nvidia and AMD are the few companies approaching the PC as a _platform in itself_. The idea that the "gamer's perspective" doesn't matter is short-sighted and NAIVE. Imagine if you had told Matrox, ATI, 3dfx and Nvidia that they should not "focus on gamers" - Nvidia and 3dfx practically pioneered an industry into existence by providing hardware to make games run faster and prettier, since even today NO CPU can compete with dedicated graphics hardware in software rendering.

Neither will integration be the death knell for discrete GPUs, because people forget 1) heat, 2) power and 3) bandwidth. I really don't care what the naysayers say - there is no fucking way you are going to be able to push computation in an integrated CPU/GPU hybrid like you can with dedicated hardware - the bandwidth problem is ALWAYS overlooked by the integration crowd. Mark Rein of Epic Games and many others predicted "the death of discrete graphics" in 2-5 years, and they've been saying that since 1997-98 or so; it's now 2010 and there is NO END IN SIGHT for discrete graphics, since now they are targeting the high performance computing space as well.

Who thought a bunch of hardware guys providing chips for graphic acceleration for gamers would grow into such multi-purpose behemoths?

Re:Not really (1)

Klintus Fang (988910) | more than 4 years ago | (#32353336)

You are correct that discrete graphics isn't going anywhere, but your argument for why gaming performance is so important in integrated graphics contradicts that. I think the same argument applies there: the thing that people who ride Intel for the performance of its integrated GPUs, and who predict that AMD will spank them in the integrated GPU space, are forgetting is: 1) heat, 2) power, and 3) bandwidth. When you integrate the GPU with the CPU you have to share the heat, power, and bandwidth budget with the CPU.

It will be interesting to see what AMD comes up with when they have an integrated GPU, but frankly, the fact that it has taken them so long to get to that step since acquiring ATI tells me that they are likely finding it rather difficult to do well. I will be very happy if they somehow come up with something that can "spank" Intel in the integrated gpu space, but I wouldn't be surprised to find that they had to cut out so many features to get it into the power/bw budget that it ends up being...about the same.

Another point worth keeping in mind is that a huge portion of the market where Intel's GPUs do reign is the business desktop space, where the only thing the machine needs to do is run web browsers, email apps, and office applications. Blu-ray decoding? In that space, it doesn't matter.

Re:Not really (5, Insightful)

forkazoo (138186) | more than 4 years ago | (#32353338)

Everyone gets up on Intel integrated GPUs because they are slow, but they are looking at it from a gamer perspective. Yes, they suck ass for games, however that is NOT what they are for. Their intended purpose is to be cheap solutions for basic video, including things like Aero. This they do quite well. A modern Intel GMA does a fine job of this. They are also extremely low power, especially the newest ones that you find right on the Core i5 line in laptops.

Funny, at this point, I thought the purpose of Intel graphics was to try and make sure that OpenCL never becomes a viable solution. Seriously, Intel does everything in their power to make their terrible graphics chips universal. They've done some pretty shady dealing over the years to try and make it happen. At this point, they have even put their GPU's right on the CPU's of their current generation laptop chips. Apple and nVidia had to come up with dual-GPU solutions that can't be as power efficient as an Intel-only solution because they have to leave the Intel GPU also running and burning power. Intel is trying to sue nVidia out of the integrated chipset market. Examples go on and on.

Why? It isn't like Intel makes all that much money on their GPU's. It's nothing to sneeze at. Intel makes more money in a year on GPU's than I'll probably make in a lifetime, but that's peanuts on the scale of Intel. It's also not enough cash to justify the effort. But, if you look at it as a strategic move to make sure that the average consumer will never have a system that can run GPGPU code out of the box, it starts to make a little more sense. Intel is trying to compete on sheer terribleness of their GPU's, because if the average consumer has an nVidia integrated GPU in their chipset, then developers will bother to learn how to take advantage of GPU computing, which will marginalize Intel's importance.

I know it sounds kind of like a crazy conspiracy theory, but after the last several years of Intel-watching, it really does seem like quietly strangling GPGPU is a serious strategic goal for Intel.

Re:Not really (0)

Anonymous Coward | more than 4 years ago | (#32353456)

My thoughts exactly. I believe many people here vastly overestimate the importance of 3D performance in the mass market. As you say, when that's not thought of as the primary feature, GMAs start looking pretty good.

Re:So Intels next cpu will the same suck video bui (1)

petermgreen (876956) | more than 4 years ago | (#32353442)

So Intel's next cpu will the same suck video build in?
Intel are already building their sucky (though not quite as sucky as it used to be) video into their dual core i3 and i5 chips (technically it's a multi-die module ATM, but from the system integrator's POV that doesn't really make any difference). With the next gen I believe they are planning to put it on-die on all their low and mid range chips (maybe the high end too; information on the next gen high end stuff seems very sketchy at the moment).

IMO Integration of graphics onto the CPU was a pretty inevitable result of Intel's decision to put the memory and fast PCIe controllers on the CPU. Putting the graphics elsewhere in such a system would either require dedicated graphics memory or having a very fast link between the CPU and the device containing the graphics with carefully designed prioritisation.

AMD will kill them.
BS, the only market segment Intel's integration of video into the CPU will really impact are those who currently explicitly buy laptops with nVidia chipsets to get a bit better graphics performance without the size and battery life sacrifices of a fully independent graphics solution. Afaict those are a fairly small proportion of the laptop market.

From a desktop perspective this is no big deal. Graphics cards with performance comparable to the best integrated graphics aren't exactly expensive. Gamers will go from leaving the graphics integrated in the northbridge disabled to leaving the graphics integrated in the CPU disabled.

Plus even if AMD had a technically better solution than Intel afaict they simply don't have the production capacity to kill Intel any time soon. Nor it seems do they have the marketing.

Intel is a great manufacturer.. not designer. (2, Interesting)

Anonymous Coward | more than 4 years ago | (#32351250)

They've never been able to bring the most innovative designs to market.. they bring 'good enough' wrapped in the x86 instruction set.

If x86 was available to all I think we'd see Intel regress to a foundry business model.

Re:Intel is a great manufacturer.. not designer. (1)

gyrogeerloose (849181) | more than 4 years ago | (#32351292)

They've never been able to bring the most innovative designs to market.. they bring 'good enough' wrapped in the x86 instruction set.

And, judging by what I've heard, a lot of people would say that the x86 instruction set itself is nothing more than 'good enough.'

Re:Intel is a great manufacturer.. not designer. (1)

hedwards (940851) | more than 4 years ago | (#32352172)

It's not so much 'good enough' as it brings along several decades' worth of cruft that isn't really necessary in the modern era. While Intel had a bright idea in Itanium and ditching the x86 instruction set, they greatly underestimated the amount of effort that it would take to port the code and ensure that the necessary applications were available. Ultimately it was more or less DOA as a result; it just took some time for that to become formalized.

Re:Intel is a great manufacturer.. not designer. (2, Interesting)

Jackie_Chan_Fan (730745) | more than 4 years ago | (#32352328)

I disagree. Intel has been destroying AMD these past 4 years.

AMD's 64bit instruction set, and athlons were a huge improvement where Intel had failed...

But now Intel's chips are faster, and AMD has been playing catch-up. For a while there AMD didn't have an answer for Intel's Core line of CPUs.

Now they do, and they're slightly cheaper than intel but they do not perform as fast as intel.

Re:Intel is a great manufacturer.. not designer. (1)

Jeng (926980) | more than 4 years ago | (#32352922)

Intel's chips are faster because Intel has much better production facilities.

Re:Intel is a great manufacturer.. not designer. (1)

Nadaka (224565) | more than 4 years ago | (#32353252)

I think you may be a bit off here. AMD meets or exceeds the performance available from Intel chips at every point of the price curve except the very high end where they do not compete at all.

The i7 920 is the only real competitor to AMD's chips in price/performance, coming in a bit higher than the AMD 955/965 in both performance and cost. Above that point, incrementally more power from Intel comes at exponentially higher costs. Below that point, AMD's chips beat everything Intel has at each price point.

Re:Intel is a great manufacturer.. not designer. (1)

Jackie_Chan_Fan (730745) | more than 4 years ago | (#32353522)

AMD does beat intel on the price curve... but not in performance. If you want the performance.. AMD has no answer for intel's cpus. I recently built a system for someone and looked at all of the cpu options. Ultimately I went with an AMD cpu for him because of his price range....

Like you said, AMD beats intel on the price... but not on the performance. If you want performance, you have to pay intels prices.

That's why they cost more. The CPUs Intel has put out these past 3 years are incredible. For a while it looked like AMD had Intel beat, but Intel really stepped up - so much that it virtually squashed all of the goodwill AMD had earned with the Athlon.

AMD's still in the game because of their prices. They're good cpus, but intel's performance is still better.

Re:Intel is a great manufacturer.. not designer. (1)

adolf (21054) | more than 4 years ago | (#32353808)

It doesn't matter what's at the top.

If performance were the only metric one needed when selecting a product, we'd all be driving Bugatti Veyrons when we wanted to go fast, Unimogs when we want to move lots of stuff slow, and AMG Mercedes-Benz SUVs when we want to move stuff along with people.

Over here in reality, though, price is a factor. And so, Toyotas, the Hyundais, and the Chevys are a much better deal for most folks.

So, even if Bugatti made a more inexpensive and practical vehicle that I might be interested in, the fact that they also may produce the Veyron does not influence my buying decisions when comparing their offerings to those of more serene brands like Toyota.

Likewise with CPUs: I don't care what manufacturer makes the fastest chip. If the features that I'm interested in are close enough to the same, then I only care about how much performance I can get for the amount that I'm willing to spend, sometimes with an eye toward power consumption.

To that end, AMD generally wins. That AMD does not have a metaphorical Veyron in their lineup does not mean that they do not produce chips which are cheap and fast for the entire gamut of typical home applications.

Re:Intel is a great manufacturer.. not designer. (1)

Nadaka (224565) | more than 4 years ago | (#32353830)

I agree that for everything from the i7 920 and up, Intel is unquestionably faster. Even the new 6-core AMD chips will be able to match/beat the i7 920 at some tasks despite similar system costs.

The AMD 955 at ~$160 outperforms the Intel E8400, Q8200, Q8400 and i5-650, available in the range of ~$150 to ~$190. The same goes for just about every lower price point as well.

I think that the larger section of the market lies in the low to mid range chips. I am not just talking about price, but value as well. In that range, AMD certainly has better value by having higher performance per dollar spent for just about any price point in that range.

Re:Intel is a great manufacturer.. not designer. (1)

Tumbleweed (3706) | more than 4 years ago | (#32353754)

I think you may be a bit off here. AMD meets or exceeds the performance available from Intel chips at every point of the price curve except the very high end where they do not compete at all.

Performance/price != Performance

And of course, it also depends mightily on exactly WHICH performance characteristics you're talking about.

missed milestones (4, Informative)

LinuxIsGarbage (1658307) | more than 4 years ago | (#32351276)

He added that Intel had 'missed some key product milestones' in the development of the discrete Larrabee product

Like proof that they were even capable of making an integrated graphics product that wasn't a pile of garbage?

GMA910: Couldn't run WDDM, thus couldn't run Aero, central to the "Vista capable" Lawsuits

GMA500: decent hardware, crappy drivers under Windows, virtually non-existent Linux drivers, worse performance than the GMA950 in netbooks.

Pressure to lockout competing video chipsets. We're lucky ION saw the light of day. http://www.pcgameshardware.com/aid,680035/Nvidia-versus-Intel-Nvidia-files-lawsuit-against-Intel/News/ [pcgameshardware.com]

Re:missed milestones (1)

KillShill (877105) | more than 4 years ago | (#32352174)

So the answer is to buy more Intel cpu's. We wouldn't want them to die and that 3 time convicted monopolist AMD be left alone to dominate...

oh wait...

"I run Linux on my shiny new i7 cpu... stick it to Microsoft..."

wait again...

Support those who support competition, not monopolists who destroy it.

Wait, what? This is news? (1, Insightful)

jtownatpunk.net (245670) | more than 4 years ago | (#32351324)

A company that hasn't produced a discrete graphics card in over a decade (I'm pretty sure I remember seeing an Intel graphics card once. Back in the 90s.) is going to continue to not produce discrete graphics cards. Wow. Stop the presses. Has Ric Romero been alerted?

Re:Wait, what? This is news? (3, Insightful)

Anonymous Coward | more than 4 years ago | (#32351450)

A large, publicly announced project with a great deal of media hype that had the potential to shake up the industry was cancelled. So, yeah, stop the presses.

Re:Wait, what? This is news? (2, Informative)

kdekorte (8768) | more than 4 years ago | (#32351560)

The i740 [wikipedia.org] card.... great expectations, poor real world experience.

Re:Wait, what? This is news? (1)

0123456 (636235) | more than 4 years ago | (#32351596)

The i740 [wikipedia.org] card.... great expections, poor real world experience.

Everyone I knew in the graphics business thought that Intel had gone completely insane with the i740; other companies were trying to cram more and more faster and faster RAM onto their cards while Intel were going to use slow system RAM over a snail-like AGP bus.

So I'd say the expectations were pretty low, at least among those who knew what they were talking about.

Re:Wait, what? This is news? (2, Informative)

TheRaven64 (641858) | more than 4 years ago | (#32351782)

To be fair to Intel, most graphics cards then were on the PCI bus, not AGP, so they didn't have the opportunity to use the host RAM except via a very slow mechanism. At the time, the amount of RAM was far more of a limitation than the speed, and a card using 8MB of host RAM via AGP was likely to have an advantage over a card with 4MB of local RAM on the PCI bus. While it was much slower than competing solutions, it was also much cheaper. The RAM on something like the VooDoo 2 was a significant proportion of the cost. A 740 cost about 20% of a VooDoo 2 and using system RAM had the advantage that you didn't have a load of RAM doing nothing while you were not doing 3D stuff. At the time the 740 was introduced, I had an 8MB VooDoo 2 and only 32MB of main memory. Having 8MB of RAM sitting doing nothing during the 90% of the time that I wasn't playing 3D games was a massive waste.

Re:Wait, what? This is news? (1)

0123456 (636235) | more than 4 years ago | (#32351964)

To be fair to Intel, most graphics cards then were on the PCI bus, not AGP, so they didn't have the opportunity to use the host RAM except via a very slow mechanism.

If by 'most' you mean 'Voodoo-2', yes. From what I remember all the cards I was using at the time Intel was trying to sell the i740 (Permedia-2, TNT, etc) were on the AGP bus.

I believe 3dfx were pretty much the last holdouts on PCI, because game developers had deliberately restricted their games to run well on Voodoo cards, thereby ensuring that they didn't need much bus bandwidth (any game which actually took advantage of AGP features so it ran well on a TNT but badly on a Voodoo was slated in reviews).

Re:Wait, what? This is news? (2, Insightful)

TheRaven64 (641858) | more than 4 years ago | (#32352346)

From what I remember all the cards I was using at the time Intel was trying to sell the i740 (Permedia-2, TNT, etc) were on the AGP bus.

Check the dates. The i740 was one of the very first cards to use AGP. Not sure about the Permedia-2, but the TNT was introduced six months after the i740 and cost significantly more (about four times as much, as I recall). It performed a lot better, but that wasn't really surprising.

Re:Wait, what? This is news? (1)

91degrees (207121) | more than 4 years ago | (#32351942)

Thing is, Intel were never in the same market as AMD and nVidia. Sure, nVidia had a few budget parts but that's not their main product. They're really making money from mid-range chips for PC gamers.

Anyone who would be satisfied with Intel would consider AMD and nVidia to be hopelessly expensive. Anyone who would consider paying for a decent graphics card would consider Intel chips to be worthless.

Re:Wait, what? This is news? (1)

hedwards (940851) | more than 4 years ago | (#32352228)

You mean ATI and nVidia. AMD only recently took over ATI, and the AMD chips were budget friendly compared with the typically much more expensive chips that Intel was selling. I realize that you're technically correct, it's just a tad misleading to suggest that Intel was competing with AMD over that time period when it was a completely different company.

Re:Wait, what? This is news? (1)

ElectricTurtle (1171201) | more than 4 years ago | (#32352546)

Considering that the context here is Larrabee which didn't publicly exist until after the AMD/ATI merger, the usage is appropriate. Further, that merger was half a decade ago. In the tech world that really doesn't count as 'recent' anymore. You really need to get over it and live in the present.

Re:Wait, what? This is news? (2, Interesting)

gman003 (1693318) | more than 4 years ago | (#32352094)

The Larrabee chips actually looked pretty good. There was a lot of hype, especially from Intel. They demoed things like Quake Wars running a custom real-time ray-tracing renderer at a pretty decent resolution. Being able to use even a partial x86 ISA for shaders would have been a massive improvement as well, both in capabilities and performance.

From what I've been able to piece together, the problem wasn't even the hardware, it was the drivers. Apparently, writing what amounts to a software renderer for OpenGL/DirectX that got good performance was beyond them.

Another part was an odd insistence on doing all the rendering in software, even stuff like texel lookup and blitting, but that's another story.

I wonder (1)

halestock (1750226) | more than 4 years ago | (#32351328)

Does this mean that they'll be focusing on continuous graphics instead?

Re:I wonder (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32351498)

Does this mean that they'll be focusing on continuous graphics instead?

More directly, what the hell is "discrete graphics"? I've been working with computers since the TRS-80 days and this is the first time I've seen the term. The writeup makes it sound like "discrete graphics" is some revolutionary kind of new display method that will make our old idea of viewing "pixels" on a "screen" obsolete, but the Google tells me that "discrete graphics" just means a video card as opposed to an onboard chip. Call it a video card! (Or more precisely, the chip that would run on a video card as opposed to being integrated into the motherboard.)

A discrete component (4, Informative)

tepples (727027) | more than 4 years ago | (#32351676)

More directly, what the hell is "discrete graphics"?

It refers to a graphics processor as a separate (discrete) component of a computer system. A chip that does nothing but graphics can be more powerful than integrated graphics because the GPU circuitry doesn't have to share a die with the rest of the northbridge.

I think they mean (0)

Anonymous Coward | more than 4 years ago | (#32351686)

a discrete (meaning separate) card, as opposed to a graphics system that's integrated with the CPU's processing power.

So basically, like ATI and nVidia have been doing for years.

Please someone tell me if I'm wrong... TFA doesn't really define the term "discrete graphics card" very well so I'm just using my understanding of the word.

Re:I think they mean (1)

hedwards (940851) | more than 4 years ago | (#32352264)

It's discrete as in not continuous. At least I think that's how the term came to be.

Re:I think they mean (0)

Anonymous Coward | more than 4 years ago | (#32352452)

Yeah, those old-fashioned analogue graphics cards were great for picture fidelity, especially with gold-plated graphics cables.

Re:I wonder (3, Informative)

TheRaven64 (641858) | more than 4 years ago | (#32351810)

It's not a new term, and it's not unique to GPUs. The distinction between integrated and discrete coprocessors has been around for at least 25 years. If you read something like Byte from the early '90s, you will find discussions about the relative merits of integrated and discrete FPUs. You'll find a similar discussion on integrated and discrete MMUs and various other components if you look a few years earlier.

Re:I wonder (2, Funny)

bill_mcgonigle (4333) | more than 4 years ago | (#32352890)

If you read something like Byte from the early '90s, you will find discussions about the relative merits of integrated and discrete FPUs.

yikes, my memory of installing an 80387 has been completely un-accessed for at least a decade. Thanks for the scrub. :)

Re:I wonder (1)

networkBoy (774728) | more than 4 years ago | (#32353700)

and 80287 and 8087
meh
I miss those days some times.
then I use my computer to do something and I cease missing them.

Re:I wonder (1)

jandrese (485) | more than 4 years ago | (#32352842)

Discrete Graphics are the opposite of Integrated Graphics. Generally it refers to using a PCIe card to do the graphics instead of something built into your Northbridge (or whatever Intel calls their Northbridge now) or the CPU. Just as Integrated Graphics are synonymous with "crap", Discrete Graphics generally imply that you've got something at least somewhat capable under the hood.

Re:I wonder (0)

Anonymous Coward | more than 4 years ago | (#32351690)

Hilarious. I applaud you.

Both good and bad (3, Interesting)

TheRealQuestor (1750940) | more than 4 years ago | (#32351368)

This is bad news for one reason. Competition. There are only 2 major players in discreet graphics right now and that is horrible for the consumer. Now the good. Intel SUCKS at making gpus. I mean seriously. So either way Intel has no hope of making a 120 core GPU based off of x86 being cheap or fast enough to compete. Go big or stay at home. Intel stay at home.

Re: Discreet (1)

TaoPhoenix (980487) | more than 4 years ago | (#32351486)

I almost let this slide until you put the other half of the pun in capitals!

There is lots of tasty competition producing NSFW "Discreet Graphics" that Sucks!

Re:Both good and bad (1)

keeboo (724305) | more than 4 years ago | (#32351492)

This is bad news for one reason. Competition. There are only 2 major players in discreet graphics right now and that is horrible for the consumer.

What about VIA and Matrox?

Re:Both good and bad (1)

h4rr4r (612664) | more than 4 years ago | (#32351712)

They produce joke cards.

Re:Both good and bad (1)

ElectricTurtle (1171201) | more than 4 years ago | (#32351736)

He said "major players". VIA's IGPs are generations behind and only end up in 'budget' computers or embedded appliances. Matrox serves a niche market primarily focused on professional workstation rendering. Neither competes head to head with nVidia or AMD/ATI.

Re:Both good and bad (1, Interesting)

Anonymous Coward | more than 4 years ago | (#32351850)

What about them?

VIA's chips suck, like everything else they put out (e.g. their ridiculous re-badged Cyrix CPUs and stability-challenged motherboard chipsets).
Matrox is a niche player (multi-monitor etc.). Their performance is on a par with Intel's GMA, and their prices are a lot higher than "free", which is what a GMA core basically costs when buying a modern Intel CPU or chipset.

Larrabee was indeed the only serious contender in discrete GFX we've seen for the better part of a decade or so, but it seems the Larrabee project management over-promised and under-delivered.
Intel corp. basically chose to scrap the project rather than suffer the embarrassment of a card that would probably have been one or two generations slower than NVIDIA and AMD's enthusiast parts, while using more power.

It would have been kick-ass for specialty graphics (truly fast vector graphics, voxel rendering, anything that doesn't conform to the standard vertex+effects shader pipeline of today's 3D graphics) and GPGPU though.
Shaders are a horrible kludge compared to just having a bunch of real CPUs with real memory and just running your algorithm like on a normal cluster.

Re:Both good and bad (3, Informative)

FreonTrip (694097) | more than 4 years ago | (#32352124)

VIA stopped designing motherboards for AMD and Intel CPUs about two years ago. Consequently, you can't find its GPUs in many places aside from embedded systems or ultra low-budget netbooks and the like. Weirdly they still sell a minuscule number of discrete cards, primarily overseas, but without divine intervention they'll never become a serious player again.

Matrox serves niche markets, mostly in the way of professional workstations, medical imaging equipment, and the odd sale of their TripleHead system to the ever-eroding hardcore PC gamer market.

In case anyone wonders what happened to the others: Oak Technologies' graphics division was acquired by ATI many moons ago; Rendition was eaten by Micron in 1998 and their name is now used to sell budget RAM; SiS bought Trident's graphics division, spun off their graphical company as XGI Technologies, had a series of disastrous product releases, and had their foundries bought by ATI, who let them continue to sell their unremarkable products to eager low-bidders; and 3dfx was mismanaged into oblivion a decade ago.

Re:Both good and bad (1)

Jeng (926980) | more than 4 years ago | (#32352732)

and 3dfx was mismanaged into oblivion a decade ago

And Nvidia picked up the pieces from 3dfx.

Re:Both good and bad (1)

FreonTrip (694097) | more than 4 years ago | (#32353272)

Right you are; sorry I forgot to put that in.

Re:Both good and bad (1)

FreonTrip (694097) | more than 4 years ago | (#32353298)

Right you are; I forgot to mention that. Thanks!

Re:Both good and bad (1)

Jackie_Chan_Fan (730745) | more than 4 years ago | (#32352382)

Nvidia has been incredible for the consumer for a long time now.

I would like to see their quadro products come down in price though. They are ridiculously overpriced.

Hrmmm. (1)

SLot (82781) | more than 4 years ago | (#32351446)

Doesn't bode well for the future of Project Offset.

Intel's NotToBee GPU (2, Funny)

julie-h (530222) | more than 4 years ago | (#32351890)

Actually Intel have changed the name to NotToBee.

Larrabee was a hedge anyway (5, Interesting)

Funk_dat69 (215898) | more than 4 years ago | (#32351904)

I kind of think Larrabee was a hedge.

If you think about it, around the time it was announced (very early on in development, which is not normal), you had a bunch of potentially scary things going on in the market.
Cell came out with a potentially disruptive design, Nvidia was gaining ground in the HPC market, OpenCL was being brought forth by Apple to request a standard in hybrid computing.

All of sudden it looked like maybe Intel was a little too far behind.

Solution: Announce a new design of their own to crush the competition! In Intel-land, sometimes the announcement is as big as the GA. Heck, the announcement of Itanium was enough to kill off a few architectures. They would announce Larrabee as a discrete graphics chip to get gamers to subsidize development and....profit!

Lucky for them, Cell never found a big enough market and Nvidia had a few missteps of their own. Also, Nehalem turned out to be successful. Add all that up, and it becomes kind of clear that Larrabee was no longer needed, negating the fact that it was a huge failure, performance-wise.

Intel is the only company that can afford such huge hedge bets. Looks like maybe another one is coming to attack the ARM threat. We'll see.

Fucking journalism (0)

Anonymous Coward | more than 4 years ago | (#32351952)

There's a difference between abandoning and postponing. But that would not be catchy anymore so you wouldn't make so much money on your website ads.
