Early Ivy Bridge Benchmark: Graphics Performance Greatly Improved

Unknown Lamer posted more than 2 years ago | from the improves-with-age dept.

The folks over at AnandTech managed to spend some time with early Ivy Bridge production samples and perform a few benchmarks. The skinny: CPU performance is mildly increased as expected, but the GPU is 20-50% faster than the Sandy Bridge GPU. Power consumption is also down about 30W under full load. The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off of the Fusion's). Is the tradeoff worth it?


Tradeoff? (5, Insightful)

Sycraft-fu (314770) | more than 2 years ago | (#39275955)

It isn't meant to be powerful graphics. It isn't a "tradeoff". Intel's HD graphics are meant to be very low power but competent enough to run the basics, shiny OS features at least. That they do, and it sounds like IB is even better at that. But getting a good CPU with basic graphics isn't a "tradeoff"; that's called "normal". If you need good graphics, discrete is still the way to go and there are plenty of reasonable options.

From the look of it, Ivy Bridge is quite a win. Sandy Bridge, but a bit better. Nothing not to like there.

Re:Tradeoff? (1)

tepples (727027) | more than 2 years ago | (#39275985)

If you need good graphics discrete is still the way to go

Do they have interchangeable discrete video cards for typical laptops yet?

Re:Tradeoff? (0)

Anonymous Coward | more than 2 years ago | (#39276051)

...they have for years; my consumer Dell from 2005 could do this; I upgraded it and a Sager from 2007 once each.

Depends on what you mean (4, Informative)

Sycraft-fu (314770) | more than 2 years ago | (#39276063)

So basically all laptops that have discrete graphics have it socketed in an nVidia MXM slot. It's way cheaper for the manufacturers to have one board and just drop different cards onto it. However, the thing is that since it is for OEMs and not consumers, it isn't as easy to swap as a PCI card. It is all on you to make sure the card you are getting is physically the right size, electrically something your system can handle, and thermally not too much.

Also, pretty much only Sager actually supports doing it; other laptop manufacturers will tell you to GTFO if you ask them about it. As such, even finding the parts isn't easy.

With laptops you don't really upgrade much other than maybe the RAM or disk.

However the IB will be useful in laptops not only because it can give better performance for integrated only systems, but it'll be nice for switchable ones. You can get ATi card systems where you can manually switch between discrete and integrated and nVidia ones that do it on the fly. Better integrated graphics means you can use them for more things, so when on battery it is more feasible to use them and leave the discrete system shut down.

However, note that this isn't a laptop part they're talking about; this is the desktop part.

Re:Depends on what you mean (1)

rhook (943951) | more than 2 years ago | (#39276791)

99% of laptops that have discrete graphics have the GPU soldered to the mainboard.

Re:Depends on what you mean (1)

Creepy (93888) | more than 2 years ago | (#39277663)

And when they do have MXM slots, those are sometimes tied to manufacturer drivers and customized hardware that will only start if it sees the manufacturer's hardware. Some people have said drivers from laptopvideo2go and such work, but I haven't tried it (my last two laptops both had soldered on discrete graphics).

Not anymore (2)

Sycraft-fu (314770) | more than 2 years ago | (#39277735)

Crack them open some time. Slots are the big thing since it keeps production costs down.

Re:Tradeoff? (1)

JoshRosenbaum (841551) | more than 2 years ago | (#39276593)

I've always wondered why there isn't some kind of expansion port standard for video cards on laptops. Let me plug a video card black box into the side of my laptop! I don't care if I need a power adapter for the video card box. That way I can use the normal onboard graphics as needed, but occasionally, when I want to game, I can just plug in my video card box, turn on my laptop, and the laptop will automatically switch to using it for graphics. Heck, maybe the port could be PCI Express (without power if needed), that way it could have other uses as well.

Anybody more familiar with this issue (hardware or market) have any thoughts on the feasibility of this? Anybody know why something like this hasn't been done?

Possible reasons off the top of my head:
Laptop makers want a user to buy a whole new laptop when it is "slow".
Hardware issue.
Most users wouldn't use it and we only cater to the top X% of people.

Re:Tradeoff? (2)

spire3661 (1038968) | more than 2 years ago | (#39276651)

Thunderbolt will do this. There are prototype thunderbolt GPU enclosures out there now. We'll start seeing them soon, hopefully.

Re:Tradeoff? (0)

Anonymous Coward | more than 2 years ago | (#39276975)

Thunderbolt is slower than PCIe x4, and using a video card will max out the bus as well.

Re:Tradeoff? (1)

voidptr (609) | more than 2 years ago | (#39276725)

Thunderbolt is essentially external PCIe, and there are a few external PCIe enclosures now designed for this use so you can attach a better graphics card to a MacBook Pro or Air when you're at your desk.

Re:Tradeoff? (1)

RatPh!nk (216977) | more than 2 years ago | (#39276727)

There is the promise of this with Thunderbolt, but latency is an issue. However, I believe there is at least one on the market here. [extremetech.com]

Re:Tradeoff? (1)

subreality (157447) | more than 2 years ago | (#39276743)

For external cards, you're looking for Thunderbolt. I have high hopes for it.

Internal cards are caught between all the factors you mentioned plus the very limited internal space. Laptop manufacturers don't have much incentive to reserve a large volume for an aftermarket upgrade that most users will never be interested in. It's a niche someone might eventually cater to, but don't hold your breath.

Re:Tradeoff? (4, Informative)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39276805)

There have been a few stabs at it, I think both ATI and Nvidia have released more-or-less-orphaned-on-launch partnerships with some laptop outfit or other, using proprietary cabling.

My understanding is that there are a few major hurdles:

Historically, there really haven't been any good standardized high-bandwidth interfaces to the outside world on laptops. The proprietary docking station port, if provided, might connect directly to the PCI bus; but your next best bets were relatively lousy things like PCMCIA or USB. Even with PCIe, you get 1x from an ExpressCard slot; but the standards for external cabling for anything beefier than that have been languishing in the PCIe SIG forever...

Unless you are content to use an external monitor only, an 'expansion GPU' both has to have access to all the usual bandwidth that a GPU requires and have enough bandwidth (and suitable software/firmware cooperation) to dump its framebuffer back to whatever internal GPU is driving the laptop screen (rough numbers in the sketch after this comment). You can get (albeit at unattractive prices) enclosures that connect to the 1x PCIe lane in an ExpressCard slot and break that out into a mechanically 16x PCIe card enclosure with supplemental power. Assuming the BIOS isn't a clusterfuck, you can pop in an expansion card just as you would on a desktop. That only gets you the video outs, though; it doesn't solve the trickier and more system-specific problem of driving the laptop screen.

Docking stations: At present, laptop manufacturers get to designate one line as 'enterprise' by including the necessary connector, and then charge a stiff fee for the proprietary docking station as your only option to drive a few extra heads. I imagine that this blunts the enthusiasm of the major enterprise laptop players for a well-standardized and high bandwidth external connector.
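To put a rough number on the framebuffer copy-back problem mentioned above, here is a back-of-envelope sketch in C++. The panel size, refresh rate, and per-lane PCIe rates are nominal assumptions for the ExpressCard era, not measurements of any particular enclosure.

    // Rough check: can a 1080p60 framebuffer be copied back over an
    // ExpressCard-class PCIe x1 link? All figures are nominal assumptions.
    #include <cstdio>

    int main() {
        const double width = 1920, height = 1080; // assumed laptop panel
        const double bytes_per_pixel = 4;         // 32-bit color
        const double fps = 60;                    // assumed refresh rate

        // Traffic needed just to ship rendered frames back to the internal GPU.
        const double copyback_mb_s = width * height * bytes_per_pixel * fps / 1e6;

        const double pcie1_x1_mb_s = 250; // nominal PCIe 1.x x1 (ExpressCard 1.x)
        const double pcie2_x1_mb_s = 500; // nominal PCIe 2.0 x1 (ExpressCard 2.0)

        std::printf("1080p60 copy-back traffic: ~%.0f MB/s\n", copyback_mb_s);
        std::printf("PCIe 1.x x1 budget:        ~%.0f MB/s\n", pcie1_x1_mb_s);
        std::printf("PCIe 2.0 x1 budget:        ~%.0f MB/s\n", pcie2_x1_mb_s);
        return 0;
    }

On those assumptions the copy-back alone is roughly 500 MB/s, which already saturates a PCIe 2.0 x1 link before any command or texture traffic, which is why external-monitor-only setups are so much easier.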

Re:Tradeoff? (1)

CastrTroy (595695) | more than 2 years ago | (#39277101)

If they took out the optical drive, you would have plenty of space for a pretty spiffy graphics card. I'm sure the majority of people would rather have a really nice interchangeable graphics card in their laptop than an optical drive. If you really need an optical drive, there's always the USB option. Sure, there are things like Thunderbolt, which is basically external PCIe, but I think it would make much more sense to just leave the graphics card in the main laptop body and do away with the optical drive.

Re:Tradeoff? (1)

drsmithy (35869) | more than 2 years ago | (#39278895)

I've always wondered why there isn't some kind of expansion port standard for video cards on laptops.

There is: ExpressCard and Thunderbolt.

The reason you don't see anyone actually doing it is because serious customer demand for upgradeable GPUs in laptops is, for all intents and purposes, nonexistent.

Re:Tradeoff? (0)

Anonymous Coward | more than 2 years ago | (#39276563)

Correct. However it still won't matter to the Intel fanboys who want AMD to die in a fire, despite the fact that the market continuously needs competition in the CPU-GPU arena.

AMD could come out with a price-to-performance Intel crushing game changer tomorrow, and the fan-boys would still be damning them until New Years.

And no, this isn't an AMD fan-boy rant. Just an observation that historically these things have gone in cycles. Right now, Intel is on top in most specific arenas.

Re:Tradeoff? (-1)

Anonymous Coward | more than 2 years ago | (#39276923)

AMD has a history of buggy CPUs at launch and bad price/performance ratios on non-bleeding-edge CPUs, and that's why buyers are cautious about them.

And I'd add thermal issues on high-end laptop parts.

Seriously, there is no reason at all to go AMD right now.

(I wasn't surprised by the DragonFly guys finding a bug in their last CPU either)

Competition is not an excuse to buy from a sloppy manufacturer. They should compete on merit, not rely on charity while hoping for better times.

Re:Tradeoff? (2)

Luckyo (1726890) | more than 2 years ago | (#39276961)

Thing is, many people like games. And games are demanding. Llano and brazos allow playing mainstream 3D (as in not angry birds/solitaire) games at low settings.
Sandy/Ivy Bridge and Atom, on the other hand, are utterly useless for that. They can run Aero and give very low-end support to video decoding in hardware, and that's pretty much it.

So if you're buying a machine where you intend to actually use that GPU for anything more graphically intensive than Aero, Intel is simply not an option unless you're also getting a discrete graphics card. So yes, it's a tradeoff, a very significant one for some and insignificant for others. It depends on the needs of the user.

A great example is my mom, who loves her atom netbook. She doesn't even run aero in w7. On the other hand, I have a brazos laptop that can run starcraft 2 on low/medium for several hours off battery, which is simply unmatched by any intel laptop on the market.

Re:Tradeoff? (1)

tepples (727027) | more than 2 years ago | (#39277127)

Thing is, many people like games. And games are demanding.

Indie games tend not to be quite as demanding due to the cost of producing detailed assets, and mainstream games tend to be ported to consoles. So a lot of people will buy a homework-and-Facebook PC with integrated graphics and buy a console for those games that won't run on a homework-and-Facebook PC.

Re:Tradeoff? (1)

Luckyo (1726890) | more than 2 years ago | (#39278493)

Gamers who only play indie games are an extremely small minority, likely below single digit in terms of percentage. Most people who play indie games also play non-indie games.

Re:Tradeoff? (1)

tepples (727027) | more than 2 years ago | (#39279311)

Most people who play indie games also play non-indie games.

And they have the PC with a GMA for homework, Facebook, and indie games, and the console for major label games.

Re:Tradeoff? (0)

Anonymous Coward | more than 2 years ago | (#39277723)

If you look at the benchmarks, Ivy Bridge is nowhere near useless; it's actually pretty good if you can bear lower resolution and/or details. In the age of console ports, it actually copes pretty well.

Re:Tradeoff? (1)

Luckyo (1726890) | more than 2 years ago | (#39278611)

Having looked at them, I stand by my opinion. Even the highest-end part available, the HD 4000, loses to the comparable AMD offering by around a third to a half. That's not even counting the cheating in filtering tests (which apparently has been reduced).

For example, in my book, SC2 is barely playable on the AMD offering. Losing a third to half the FPS takes it quite far into unplayable territory.

Re:Tradeoff? (2)

b0bby (201198) | more than 2 years ago | (#39277145)

My reading was that the tradeoff was between Intel's more powerful CPU/less powerful GPU, and AMD's more powerful GPU/less powerful CPU offerings. In that case there is a real tradeoff - you can't get both the more powerful CPU & GPU in one package.

Re:Tradeoff? (2)

TheNinjaroach (878876) | more than 2 years ago | (#39277729)

I thought the tradeoff mentioned in the summary was with regards to Intel vs AMD: You get better graphics with the integrated AMD Fusion chips, but poorer CPU performance.

In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.

Re:Tradeoff? (1)

Anthony Mouse (1927662) | more than 2 years ago | (#39279151)

In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.

Yeah, AMD's marketing department is full of fail. They were telling everyone "50% faster than Sandy Bridge" and giving people high expectations, so after the benchmarks came out instead of people thinking "meh, it's OK" everybody was running around predicting the end of the world.

On top of that, they sent reviewers the FX-8150, the 8-thread version with the worst single thread performance per dollar because you're paying for eight threads whether you use them or not. So the reviewers compared it to the Intel chip that cost the same amount (i5-2500k) and it looks like crap on anything single-threaded. Meanwhile the FX-4170 has better single thread performance and is $100 cheaper.

Re:Tradeoff? (4, Interesting)

hairyfeet (841228) | more than 2 years ago | (#39278077)

But it IS a tradeoff Blanche, it is. You see, most folks are embracing the wonder that is "The Internet" and all the TV, movies, and other entertainment that this wonderful medium has to offer, and Intel GPUs...well, they suck REALLY hard.

But here is the dirty little secret AMD knows that Intel doesn't want you to hear, going so far as to shoot their Atom division in the face by killing off the Nvidia chipset business and hobbling Atom with insanely shitty rules like "Only 10 inches with crappy resolution" and "Only 2GB of RAM", and the secret is this: most folks simply aren't slamming even 5-year-old chips hard enough to worry about, much less the newer ones. You see, chips passed "good enough" for the vast majority once we hit dual cores, so the fact that AMD's chips are 30% slower really doesn't matter if the user is only using less than half the power available anyway. And having that really nice GPU makes everything nice and smooth, with great HD video and even gaming if you so desire, although the majority isn't playing heavy CPU-slamming games but crap like Farmville and Mob Wars.

This is why both my desktop and netbook are AMD, and I sold my full-size laptop for the netbook because I found that when I was mobile I simply wasn't hitting the CPU hard enough to matter. My Thuban X6 has OCing headroom up the wazoo should I ever need it, but with most games barely hitting dual cores and transcoding on 6 cores being so sweet I doubt I'll need it. And the E350? Man, whoever designed that chip needs to be given a Corvette and a raise by AMD, because that thing is bloody brilliant! 6 hours playing HD video at default voltages (BTW if you have an E or C series check out Brazos Tweaker [google.com] as you can add 20%-30% battery life by using it) and the ability to just pop in an HDMI cable for full 1080p goodness; hell, it even plays L4D, Bioshock II, and GTA:VC (I could play the newer GTA games but I don't care for them), all while staying cool to the touch and quiet as a churchmouse. The OEMs have taken notice (now that Intel isn't bribing them not to anymore) and you can see everything from HTPCs to laptops and netbooks to all-in-ones and desktops running Brazos. In fact, last time I walked into my local Walmart Supercenter there were only 2 Intel units, both of which were bottom-o'-the-line Atoms; the rest of the store? All AMD Fusions. I've built several office boxes (the traditional stronghold of Intel) with the E350 and the employees just love them, whisper quiet while giving them plenty of power for their everyday tasks.

Where Intel screwed the pooch was being too greedy. They SHOULD have made a deal with Nvidia to ensure plenty of new GPUs for their chips; instead they wiped out the entire Nvidia chipset business and made many of their chips simply overpriced and underperforming, especially in the laptop arena where you can't add a discrete card. ION and Optimus were the perfect answer, with the low-power shitty Intel chip for when you were on battery and the Nvidia chip when you were plugged in, but now that the option is gone, frankly I wouldn't touch Intel on a mobile unless it had a discrete and I warn my customers of the same. As we get more and more multimedia heavy, folks want good graphics with smooth video and nice gaming, and Intel just doesn't have that. You can buy an AMD A series for probably half of what this chip is gonna cost, an E series for like one fifth, and while you won't notice the CPU unless you are doing number crunching or some other task one doesn't do on a mobile often, you WILL notice the much nicer graphics. Intel just went the wrong direction on this IMHO and will pay the price.

But still slower then a "real" video card... (4, Interesting)

Kenja (541830) | more than 2 years ago | (#39275961)

Frankly, I am sick and tired of these integrated GPUs. The theory is that it's a cost saver, but since I just put in a dedicated graphics card it ends up being a cost with no benefit. Ah well.

Re:But still slower then a "real" video card... (2)

tepples (727027) | more than 2 years ago | (#39276003)

People who don't use a computer other than for homework and Facebook don't feel the need to put in a dedicated graphics card. As I understand it, as long as Office and YouTube work, non-gamers and console gamers are perfectly fine if a computer's 3D capability is comparable to a Voodoo3 from over a decade ago.

Re:But still slower then a "real" video card... (1)

Anonymous Coward | more than 2 years ago | (#39276079)

Even an ION board blows the doors off a Voodoo from a decade ago. 32-bit full-precision calculations, with programmable shaders. Voodoo cards were fixed-function "game accelerators" that cheated with dithering. No GPGPU applications there - which is the real story behind these things, and where personal computing is moving to.

Even with a separate discrete card, that "unused" GPU is available for number crunching, and is pretty damned good at it.

Which ION based desktop PC? (1)

tepples (727027) | more than 2 years ago | (#39276215)

Even an ION board blows the doors off a Voodoo from a decade ago.

I've noticed that Best Buy doesn't have the ION-based Aspire Revo PC anymore. Is the EeeBox any good for those who choose to buy rather than build? And I thought NVIDIA had got out of the Intel chipset business anyway [slashdot.org] due to patent licensing squabbles.

Re:Which ION based desktop PC? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39276397)

There is 'ION' and 'ION2':

The former is an Intel Atom, running in the GTL mode that allows it to pair with core logic designed for P4s, using an Nvidia chipset instead of the bottom-barrel GMA-950-hobbled Intel chipsets that were the cheap (but not especially power-efficient or high-performance) thing to pair with Atom parts.

With the later Atom revisions, Intel moved to a new chipset interface that they assert Nvidia has no right to interface with (they moved to a different, also Nvidia-disallowed, interface for desktop and server/workstation parts). Here, with 'ION2', it's an Intel CPU and chipset, combined with Nvidia's lowest-end discrete GPU.

Not much Nvidia can do about it; but ION actually became a suckier value proposition in its second revision. The first was worse only in that it was slightly more expensive. The second cost more and required more board space and power (because it wasn't replacing an inferior part with a better one, but adding an additional GPU, and possibly some RAM, since access to system RAM was pitifully slow for the outboard GPU).

Re:But still slower then a "real" video card... (1)

Guppy (12314) | more than 2 years ago | (#39276475)

Voodoo from a decade ago.

Here's a link to a benchmark [guru3d.com] of 3dfx's never-released Voodoo5-6000 (modified and overclocked).

3dfx Voodoo5 6000 3700A Gold SE @201 MHz ( 3dmark2001se ): 6341 marks

Looks like you're right. An integrated nVidia ION does indeed beat it in benchmarks: http://hwbot.org/hardware/videocard/nvidia_ion_integrated/ [hwbot.org]

Re:But still slower then a "real" video card... (2)

Sir_Sri (199544) | more than 2 years ago | (#39276139)

These are a hell of a lot better than that. They aren't good, but they will manage to play skyrim for example (albeit at a relatively low resolution for a decent framerate). http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/15

These are probably the wrong direction for the product though. They don't viably compete with a discrete GPU, so people who can, would rather not have to buy an integrated GPU at all, and for business it's so powerful it's letting employees game on work computers, which isn't good.

On the other hand, for the home user market, it's good enough that people can do whatever they want on the machines and they'll manage.

Re:But still slower then a "real" video card... (1)

tepples (727027) | more than 2 years ago | (#39276365)

They aren't good, but they will manage to play skyrim for example (albeit at a relatively low resolution for a decent framerate).

Just in case someone doesn't care to click through to the benchmark, allow me to summarize: AnandTech reports 46 fps at 720p and no AA for Skyrim, a PC game comparable to PS3 games. So no, using integrated graphics doesn't mean going back to Dreamcast-class graphics anymore.

Re:But still slower then a "real" video card... (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39276419)

Arguably, any business whose strategy to prevent staff gaming involves the GMA950, rather than software administration or hiring responsible people deserves the huge number of games of Farmville and Snood currently being played on their network...

Re:But still slower then a "real" video card... (1)

DigiShaman (671371) | more than 2 years ago | (#39277121)

My wife's Dell Inspiron has an i3 CPU with integrated Intel HD graphics. Historically, integrated graphics have sucked. But her laptop manages to sail on through 2D like a hot knife through butter. Facebook games (Flash anything, really) and DirectX 2D games such as Bejeweled 3 run silky smooth at maximum resolution. No frame rate dropping whatsoever.

As a PC gamer, I'm starting to wonder if the PC platform will be viable for hardcore gaming in the future. At the least, there will be a slow period until Intel can catch up with what nVidia and AMD can offer now. I only wonder because the current offering of Intel's integrated graphics is good enough for most people. That translates into fewer dedicated graphics sales for the other guys, which in turn means less money for R&D. That can only mean that the cost of dedicated graphics will go up while the expected increase in performance and features from R&D expenditures goes down. It doesn't look good from my gaming perspective.

Re:But still slower then a "real" video card... (1)

Sir_Sri (199544) | more than 2 years ago | (#39280621)

Come back again in a console generation.

Right now you can't do enough better on a PC to warrant doing major PC-exclusive graphics compared to the 360/PS3. Higher resolution, better FPS, sure, but not significantly better. Intel is reasonably closing the gap on PS3/Xbox 2 level performance, but that puts them at about 0.1 of a good graphics card*.

Of course the 'next gen' consoles are in the making now, and that means we'll see consoles about on par with what you can do with a decent rig today. So then the next cross-platform titles (which will be Xbox 3, PS4, PC) won't run anywhere near viably on Intel integrated graphics.

Sure, Intel just wiped out their competition in the low-end casual/non-gamer market, but that was never a great market anyway.

*Yes, high-end GPUs are easily 10x the PS3/Xbox2 GPUs, but you have a lot more going on on the PC, which means a certain degree of inefficiency, and you have higher resolutions and framerates, and some visual candy that isn't on the consoles, which takes up the power.

Re:But still slower then a "real" video card... (1)

DigiShaman (671371) | more than 2 years ago | (#39280851)

Technically, that would seem to be about right more or less. But the one aspect of PC gaming that I truly love is a personalized seated gaming experience with precision mouse movement and a keyboard. It's all about the venue. Just as I would never play platformers and fighting games on the PC, I would never play an MMORPG on a console. Two different genres targeted and optimized for two different play styles.

As I get older, I'm less likely to play games on a console. I'm just not into the whole twitch fest anymore. Although I will play console games casually with friends for the social interaction.

Re:But still slower then a "real" video card... (1)

dubbreak (623656) | more than 2 years ago | (#39276207)

People who only use their computer for homework and Facebook don't need a Sandy or Ivy Bridge processor. A Core2 with GMA is more than sufficient.

I think it makes sense for mobile applications, but for desktop it doesn't. You can get a $40 card that will outperform the onboard. That being said I'm sure Dell etc love it. They love charging for upgrades. They're the car salesmen of the computer world. Once you add the goodies onto your base model you could have bought the top end that came with those features and more.

Re:But still slower then a "real" video card... (2)

timeOday (582209) | more than 2 years ago | (#39276685)

You can get a $40 card that will outperform the onboard.

True yesterday, false today: [anandtech.com]

Based on what we've seen, discrete GPUs below the $50 - $60 mark don't make sense if you've got Intel's HD 4000 inside your system. The discrete market above $100 remains fairly safe however.

Re:But still slower then a "real" video card... (1)

Gaygirlie (1657131) | more than 2 years ago | (#39276335)

The sad thing though is that us gamers/enthusiasts are basically paying for a GPU we'll never use. It would be nice if these CPUs were sold also without an integrated GPU.

Re:But still slower then a "real" video card... (2)

halfEvilTech (1171369) | more than 2 years ago | (#39276433)

The sad thing though is that us gamers/enthusiasts are basically paying for a GPU we'll never use. It would be nice if these CPUs were sold also without an integrated GPU.

They actually do exist, check out the Core i5 2550 for example. It has a higher clock than the 2500 for the same price. The difference is they removed the iGPU from the chip.

Re:But still slower then a "real" video card... (1)

Kjella (173770) | more than 2 years ago | (#39276609)

Well, it's not like Intel does it just to annoy you. The top Intel chips have 16 EUs, which is roughly equal to 32 shaders. A top graphics card like the 7970 has 2048 shaders. So if you use AMD's $450 price as a basis, that works out to about $7 for the Intel graphics; make it $10 to include QuickSync and whatnot. For that small savings Intel would have to validate a new design and risk a potential shortage of chips with/without IGP. Look at the die layout [pcper.com] for Sandy Bridge, there's no Ivy Bridge layout yet but it's probably the same. You see that huge chunk called "graphics"? Me neither, it's somewhere in those small "misc io" bits. That's the only little thing of your CPU you aren't using with a dGPU.
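Kjella's $7 figure is just proportional arithmetic; here is the calculation spelled out (the prices and shader counts are the ones from the comment, and the linear price-per-shader scaling is obviously a crude assumption):

    // Reproduces the rough price-per-shader arithmetic from the comment above.
    // Linear cost scaling is a deliberately crude assumption.
    #include <cstdio>

    int main() {
        const double radeon_7970_price   = 450.0;  // USD, figure from the comment
        const double radeon_7970_shaders = 2048.0;
        const double intel_igp_shaders   = 32.0;   // "16 EUs ~ 32 shaders" per the comment

        const double price_per_shader = radeon_7970_price / radeon_7970_shaders; // ~$0.22
        const double igp_equivalent   = price_per_shader * intel_igp_shaders;    // ~$7

        std::printf("price per shader: $%.2f\n", price_per_shader);
        std::printf("IGP 'worth':      $%.2f\n", igp_equivalent);
        return 0;
    }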

Re:But still slower then a "real" video card... (5, Informative)

Halo1 (136547) | more than 2 years ago | (#39276769)

Look at the die layout [pcper.com] for Sandy Bridge, there's no Ivy Bridge layout yet but it's probably the same. You see that huge chunk called "graphics"? Me neither, it's somewhere in those small "misc io" bits. That's the only little thing of your CPU you aren't using with a dGPU.

I guess that's simply a chip without an integrated GPU. Here's a picture of a Sandy Bridge Core i7 with GPU [pcmag.com] .

Re:But still slower then a "real" video card... (0)

Anonymous Coward | more than 2 years ago | (#39277485)

There are multiple products using the Sandy Bridge core.
GPU or not, more cores, fewer cores, etc.

Re:But still slower then a "real" video card... (4, Insightful)

timeOday (582209) | more than 2 years ago | (#39276757)

I would like to see the ability to use the integrated GPU, even if not for graphics. The traditional CPU is good for sequential logic. But for pattern recognition, physics simulations (which is basically what 3d graphics is), encoding, or code-cracking (e.g. bitcoin), the highly parallel structure of the GPU is better. Now you might argue, my offboard GPU is still the same thing, but better. OK. But these are inherently parallel tasks, so if you could use the one built-in AND the add-on, you wouldn't be wasting anything.

Re:But still slower then a "real" video card... (1)

UnknowingFool (672806) | more than 2 years ago | (#39276103)

Well then it sounds like Ivy Bridge is your best bet over Llano. For most people, the latest generation of integrated graphics is good enough for even 1080p video. It can even work for casual gaming.

Re:But still slower then a "real" video card... (1)

Shadow99_1 (86250) | more than 2 years ago | (#39276177)

Well, even for you, one of the advantages of the AMD Fusion platform is the ability to add in a discrete card and combine the power of the two (assuming the discrete card is another AMD and you're running under Windows). Though from what I've seen, the Fusion platform is capable enough for most 3D tasks unless you're a serious gamer who wants every bell and whistle @ 1600x1200+.

It's also a different story when it comes to laptops. Fusion is incredibly useful in the laptop market, where the entire lower end of the market uses integrated video. So if you want good 3D performance in a laptop, a Fusion-based system or an expensive discrete GPU are the only real options.

Doesn't cost much (2)

Sycraft-fu (314770) | more than 2 years ago | (#39276191)

And there is a lot of use for them.

In terms of desktop chips it is for low end use. A lot of people just do web/e-mail/word with their systems and an Intel HD graphics setup is perfect for them. It is plenty of power to do the shiny OS interface, accelerate video, and so on, and comes with the system.

In terms of laptop chips, you really always want it on account of switchable graphics. If your laptop has switchable graphics it can use the integrated for low power consumption and only kick on the discrete when needed. For ATi cards it is a bit clunky, you have to actually manually switch it, but you can do that: just use integrated on battery, discrete on power. For nVidia they have a thing called Optimus where it all happens in realtime without you noticing. You'll be on integrated on the desktop, then you fire up something intensive and it switches over seamlessly.

Regardless, they are widely used and so worth including. It would cost more for Intel to make a second variant of the chip without them.

Re:Doesn't cost much (1)

ifiwereasculptor (1870574) | more than 2 years ago | (#39276327)

Not really. See the Athlon II X4 631. It costs quite a bit less than the A6-3650. Not much, but enough for GP to have a point. Why pay even a little for something that you're not going to use at all?

Re:Doesn't cost much (1)

Sycraft-fu (314770) | more than 2 years ago | (#39276435)

Llano is a different beast. AMD is trying to whack a bigass graphics card, relatively speaking, on there. Intel's HD graphics are tiny; that is why they are low performers. But sure, if he really wants Intel processors with no graphics he can have them. Intel's high-end CPUs don't feature them, their LGA 2011 and LGA 1366 CPUs. However, they cost more, so there you go.

Intel's mainstream CPUs have integrated GPUs. They are very reasonably priced so just deal with it.

Re:Doesn't cost much (1)

ifiwereasculptor (1870574) | more than 2 years ago | (#39277059)

"Deal with it"? I though I was doing that just fine. I'm not losing my shit here, suing Intel or even registering a IHATEINTELGPUS.com. All I did was point out that buying something you'll never use is a bad deal, regardless of price.

I think AMD had the right idea: bundle a good GPU, strong enough to beat discrete graphics of past generations or it's pointless. And allow crossfire in case the user wants to upgrade, so as not to waste any resources. As for Intel, if I already have a Radeon HD5570 or Geforce 440 (quite old cards, mind you), the bundled GPU means squat to me.

Plus Llano has much more balanced GPU/CPU prowess. With Intel, you'll always be held back by poor GPU performance, so a discrete adaptor is a must for, say, games. And gamers are a huge portion of the market. Especially budget gamers, when you consider markets such as India, China, Russia, the whole of Latin America, etc. In such markets, due to import taxes and whatnot, a $10 difference is much more significant.

That's what I'm saying. A poor, half-assed GPU adaptor bundled in every processor is pretty pointless. They are bad for you whether you need no GPU or a so-so GPU, benefitting only those who specifically need a bad-to-so-so GPU. I suppose you could argue that that's what HTPCs need, but they just go ahead and bundle it with the i5 and i7, processors that are very unlikely to be used in HTPCs. It's not like Intel isn't bound to have a few GPU duds they could sell as i3-2099s, i5-2499s or something.

Re:Doesn't cost much (1)

Thavilden (1613435) | more than 2 years ago | (#39276567)

I want to add home theater PCs to the list of good uses of integrated graphics. My current HTPC has a 1.5 year old Core i5, whatever the cheapest I could get at the time, and it handles 1080p with audio resampling out over HDMI with no problem. Flash is no problem either. Not having to have the discrete graphics card is a huge benefit that allows me to use slim cases for a set-top box feel.

Re:Doesn't cost much (0)

Anonymous Coward | more than 2 years ago | (#39276865)

I'm currently using a core I3-2100 for my HTPC. It cheeses me off that Intel hasn't fixed the lack of VSync in the video driver in Linux for over a year now. There is nothing you can do that will reliably fix it. Some people have done magic voodoo and it fixed it for them but nothing has worked for me. Intel also needs to add 23.976, and 29.97 Hz outputs. The current video systems can't do that and it's annoying.

Re:Doesn't cost much (1)

ifiwereasculptor (1870574) | more than 2 years ago | (#39277171)

Isn't an i5 overkill for an HTPC? Unless you often reencode your media, a Pentium G620 would probably work just as well.

Re:Doesn't cost much (1)

Thavilden (1613435) | more than 2 years ago | (#39280723)

G620 was released a year after I bought that i5. To be honest though, I didn't look at Pentiums at all. If it had HD graphics and would handle HDMI, then I suppose I should have looked!

Faster than low end Card. (1)

guidryp (702488) | more than 2 years ago | (#39276381)

http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/11 [anandtech.com]

It was faster than low end cheapo cards. Which is mainly the point.

If you are putting in $200 cards, they are a long ways off, but they essentially obsolete the need for a low end card, which is a good thing.

And since all most people need is a low end card, this is sufficient for most people.

For desktop, internet, video, web games, older games and even new games at modest settings this is fine.

Re:But still slower then a "real" video card... (1)

iamhassi (659463) | more than 2 years ago | (#39276447)

Frankly, I am sick and tired of these integrated GPUs. The theory is that its a cost saver, but since I just put in a dedicated graphics card it ends up being a cost with no benefit. Ah well.

According to this review, the AMD A8-3850 is 50% faster than the ~$50 Radeon 6450, but 50% slower than the ~$75 Radeon 6670. [pcper.com]

So sure, it's not better than a $200 dedicated card, but it's far better than what integrated cards used to be like. Integrated will never be faster than dedicated, but if I can get about 50% of the performance from integrated then that's reasonable until I have an extra $200 for a "real" video card.

Re:But still slower then a "real" video card... (1)

ifiwereasculptor (1870574) | more than 2 years ago | (#39277239)

The A8 is way faster than Intel's offerings, works with a 6670 in crossfire so as not to diminish its value when upgrading and can be bought without a GPU as an Athlon II X4 641. I think he was referring to the fact that going Intel forces you to pay for a GPU that you'll have no use for if you have anything better than a Radeon HD5570, which is often the case, especially with processors more powerful than, say an i3.

Re:But still slower then a "real" video card... (2)

Bengie (1121981) | more than 2 years ago | (#39276557)

Having a graphics card integrated into the CPU is only one benefit. The future benefit is using the GPU as a co-CPU. AMD already has plans for the IGP to understand context switching and respect protected memory.

Some people say "why, the IGP is slower than discrete". But no one thinks, ohh, the IGP has 2-3 magnitudes less latency than a discrete GPU while being less than 1 magnitude slower.

Think of future multimedia where the CPU and IGP ping-pong data back and forth. I like to think of what kind of physics game may have once IGPs become easy to program. CPU->GPU->CPU->GPU is really slow when you're talking about microseconds round trips each hop. CPU->IGP->CPU->GPU is much faster when latency between the CPU and IGP is in nanoseconds. It may possibly even get streamlined to CPU->IGP->GPU, depending on future algorithms and engine designs.
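A toy latency model makes the ping-pong argument concrete. The per-hop figures below are placeholder orders of magnitude (nanoseconds on-die versus microseconds across PCIe), not benchmarks of any real part:

    // Toy model of the CPU<->GPU ping-pong pattern described above.
    // Per-hop latencies are placeholder orders of magnitude, not measurements.
    #include <cstdio>

    int main() {
        const double igp_hop_us  = 0.1;  // assumed on-die CPU<->IGP hop (~100 ns)
        const double dgpu_hop_us = 10.0; // assumed PCIe CPU<->dGPU hop (~10 us)
        const double work_us     = 5.0;  // assumed compute per step
        const int    steps       = 4;    // e.g. CPU->GPU->CPU->GPU

        const double discrete_us   = steps * (dgpu_hop_us + work_us);
        const double integrated_us = steps * (igp_hop_us + work_us);

        std::printf("discrete GPU round trips:   %.1f us\n", discrete_us);   // 60.0
        std::printf("integrated IGP round trips: %.1f us\n", integrated_us); // 20.4
        return 0;
    }

With these made-up numbers the hop latency, not the compute, dominates the discrete case, which is the point the comment is making.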

Re:But still slower then a "real" video card... (1)

UnknowingFool (672806) | more than 2 years ago | (#39276693)

The latest benchmark shows that the integrated graphics are better than the budget discrete cards currently offered. For uses where graphics performance does not matter as much (business desktops and laptops) this is a cost savings for them. Also the current trend today is ultrabooks which benefit greatly from not having a discrete card.

Re:But still slower then a "real" video card... (1)

durrr (1316311) | more than 2 years ago | (#39276789)

Where they really shine is when combined with a mini-ITX mobo. Now if I got around to getting an inverter and a decent battery, I could bring my desktop computer with me as a moderately bulky laptop replacement.

Re:But still slower then a "real" video card... (1)

lordmetroid (708723) | more than 2 years ago | (#39276869)

I will buy an Ivy Bridge with HD4000 and sell my dedicated graphics card.

Re:But still slower then a "real" video card... (1)

blackC0pter (1013737) | more than 2 years ago | (#39277331)

I use the integrated GPU for a 4th monitor. I drive 3 displays from a discrete card and then the 4th from the CPU. It saves the cost, hassle, power, and weird driver issues of putting in two graphics cards. Yes, I could go the route of a DisplayPort hub, but those are expensive and it limits the throughput of the monitors. I'm not going to be gaming on the 4th monitor but do want to display more info.

Re:But still slower then a "real" video card... (1)

Terrasque (796014) | more than 2 years ago | (#39279185)

Actually, there is one place where Intel's integrated GPU knocks the socks off all the competition... Video encoding!

Just look at the benchmarks and image examples from AnandTech's review [anandtech.com] .

And that's the old Sandy Bridge. If we see 30%-50% improvement over that again.. I can see some uses for the integrated card :)

Re:But still slower then a "real" video card... (1)

jayesel (2531026) | more than 2 years ago | (#39280655)

The fact that the GPU is integrated means blindingly fast transfer rates between the processors. Most likely in the THz region, which is awesome. You should see some gains, and expect this type of arch to dominate procs in the future. I also suspect that with the advent and continuing advances in graphene nanotech, things should start getting interesting very fast, and at lower power consumption to boot.

GPU Performance (1)

alanthenerd (639252) | more than 2 years ago | (#39276005)

Does GPU performance matter? Are people using integrated graphics for graphics intensive things? It seems to me that integrated graphics performance has more than exceeded that required for normal desktop graphics for a while and anyone who is doing anything seriously graphically intensive is using a discrete graphics card. Am I wrong?

Re:GPU Performance (1)

Anonymous Coward | more than 2 years ago | (#39276141)

It matters if your application has any type of OpenCL accelerated calculations. I know, who does that? Same could be said about SSEx or MMX when it first came out - who uses that??

Re:GPU Performance (3, Insightful)

imbusy (1002705) | more than 2 years ago | (#39276285)

The beauty of having an on-chip GPU is that you don't have to move data around to do computations with OpenCL. It's something that kills the benefits of using a dedicated graphics card for almost every GPGPU application. The 10-100x speed-ups are a lie.
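One way to see why the headline speed-ups evaporate is to fold the host-to-device transfer time into the total. The sketch below uses made-up but plausible timings; none of these numbers are measurements of any specific part.

    // Effective GPGPU speedup once host<->device copies are counted.
    // All timings below are illustrative assumptions, not measurements.
    #include <cstdio>

    int main() {
        const double cpu_time_ms    = 100.0; // assumed CPU-only runtime
        const double kernel_speedup = 50.0;  // headline "50x" kernel-only speedup
        const double transfer_ms    = 40.0;  // assumed PCIe copy in + copy out

        const double kernel_ms   = cpu_time_ms / kernel_speedup;  // 2 ms
        const double discrete_ms = transfer_ms + kernel_ms;       // 42 ms
        const double onchip_ms   = kernel_ms;                     // copies ~free

        std::printf("discrete GPU, effective: %.1fx\n", cpu_time_ms / discrete_ms); // ~2.4x
        std::printf("on-chip GPU, effective:  %.1fx\n", cpu_time_ms / onchip_ms);   // 50.0x
        return 0;
    }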

Re:GPU Performance (1)

Bengie (1121981) | more than 2 years ago | (#39276605)

^ This

Not all workloads need lots of throughput; some are very sensitive to latency. The IGP is a great trade-off between latency and throughput: tens of times faster at SIMD than a CPU could ever be, and 100-1000 times less latency than a discrete GPU.

Re:GPU Performance (5, Insightful)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39276247)

The main reason that integrated GPU performance matters (aside from the fact that it is all the GPU you get in any too-cheap or too-skinny device that doesn't have a discrete option) is that it defines the (overwhelmingly common) baseline for what 'PC graphics' means. If that situation is uniformly awful, GPU-intensive stuff will continue to be fairly niche, which leads to a chicken-and-egg issue: if integrated graphics suck, the market for GPU-intensive stuff will be constrained, which will reduce the incentive to improve GPU performance, and so it goes...

Re:GPU Performance (1)

rsborg (111459) | more than 2 years ago | (#39277707)

The main reason that integrated GPU performance matters (aside from the fact that it is all the GPU you get in any too-cheap or too-skinny device that doesn't have a discrete option) is that it defines the (overwhelmingly common) baseline for what 'PC graphics' means. If that situation is uniformly awful, GPU-intensive stuff will continue to be fairly niche, which leads to a chicken-and-egg issue: if integrated graphics suck, the market for GPU-intensive stuff will be constrained, which will reduce the incentive to improve GPU performance, and so it goes...

And this is exactly how Intel wants it - any level playing field that emphasizes the GPU would have them at the mercy of ATI/NVidia, as Intel's previous efforts at a competitive GPU (see Larrabee) were pretty dismal.

However, with the rise of mobile devices (iOS, Android) and ARM (even Microsoft is targeting ARM for Win8), they are cooking their own goose. They can't keep fighting yesterday's battle - it will be a Pyrrhic victory. When Win8 releases with full ARM support and PC laptop manufacturers (likely following Apple putting an A6 or A7 into a MacBook Air) put quad+ core ARMs into laptops with competent graphics platforms, Intel will wish they had been leading instead of following.

Re:GPU Performance (2)

oakgrove (845019) | more than 2 years ago | (#39276287)

It depends. Some of the most popular games in the world (think World of Warcraft, EverQuest, etc.) run well on the good-enough integrated graphics a lot of laptops come with. The AMD stuff does work better than the Intel stuff, though, by a long shot.

Re:GPU Performance (2)

Ambassador Kosh (18352) | more than 2 years ago | (#39278231)

The AMD Llano chips are better than just competent for MMO games. A laptop Llano chip will run EQ1, EQ2, WoW, The Old Republic, etc. without any discrete GPU.
They even handle things like Fallout 3 and Fallout New Vegas + lots of mods without needing a discrete card.

The Llano chips also do GPGPU without crushing your battery. So if you are on battery power you can do calculations hundreds of times faster than an Intel chip can, if you can do GPGPU, and not kill your battery doing it.

For me, I run into more and more things where the CPU is good enough but more GPU power would be better. For me, Llano hit a great sweet spot for a laptop.

GPU performance always wins (1)

Alwin Henseler (640539) | more than 2 years ago | (#39276109)

If you do office-type work only on a machine, then up to, say, ~20% performance differences are pretty much irrelevant CPU-wise, as long as you're in a certain performance class. At least as important is hard disk performance, and having enough RAM to do jobs properly.

With any more-than-casual gaming, chances are GPU performance is much, much more important than CPU. So a small CPU advantage would still lose if the GPU is weak. I wouldn't know about their latest, but historically the performance of Intel integrated GPUs has been pathetic compared to Nvidia or ATI (AMD) counterparts.

Re:GPU performance always wins (4, Interesting)

UnknowingFool (672806) | more than 2 years ago | (#39276241)

True but integrated is getting better. At this point the budget nVidia and AMD discrete cards are slightly better than Intel but IMO not worth the $50 for the slight upgrade. You are better off spending a little more and moving towards mid-range for a lot more performance.

More GPGPU offload (1)

codecore (395864) | more than 2 years ago | (#39276465)

There will be more off-loading of specific types of tasks as GPGPU becomes more mainstream. One of the things that will be driving wider adoption of GPGPU is better support in tooling. The MS-supported extension to C++ called AMP should enable many more developers to take advantage of DX11 (or better) hardware for non-graphics work. It looks like support for AMP will be in Dev 11. If Ivy Bridge has DX11 support with more than a few GPU cores, then Ivy Bridge users should benefit.
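For anyone who hasn't seen AMP, a minimal sketch of the style it enables follows: the standard array_view / parallel_for_each / restrict(amp) pattern. It needs a DX11-capable device and a compiler with AMP support (the Visual C++ "Dev 11" toolchain the poster mentions); treat it as an illustration rather than production code.

    // Minimal C++ AMP sketch: double every element of a vector on a DX11 GPU.
    #include <amp.h>
    #include <vector>
    #include <iostream>

    int main() {
        std::vector<float> data(1024, 1.0f);

        // Wrap the host data; AMP manages copies to and from the accelerator.
        concurrency::array_view<float, 1> av(static_cast<int>(data.size()), data);

        // restrict(amp) marks the lambda as compilable for the accelerator.
        concurrency::parallel_for_each(av.extent,
            [=](concurrency::index<1> idx) restrict(amp) {
                av[idx] *= 2.0f;
            });

        av.synchronize();              // copy results back into 'data'
        std::cout << data[0] << "\n";  // prints 2
        return 0;
    }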

Re:More GPGPU offload (1)

UnknowingFool (672806) | more than 2 years ago | (#39277167)

Intel announced last year that Ivy Bridge would have DX11 support.

Re:GPU performance always wins (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39276471)

The great thing about whatever card is currently going for $50 nominal, $20-$30 if the rebate gods smile upon you, is the video outputs:

Back in the bad old days, buying a bottom-barrel graphics card meant getting a single VGA out (and possibly one where the manufacturer had cheaped out so hard that analog quality issues were visible...), and lousy performance, and the PCI ones that you needed to run more than one, once your AGP slot filled, were always mysteriously overpriced (alas, this still seems to be the case with 16x vs. 1x PCIe versions...)

Now, even a cheap POS, low-profile, fanless, last-gen whatever card has at least a DVI and a VGA connector, possibly even an HDMI if you splurge an extra $10 for the lousiest 'media PC' part currently on sale. Plus, computers with enough PCIe slots to run more than one are downright common, so you can get some serious pixel area for dirt cheap...

Re:GPU performance always wins (1)

Anonymous Coward | more than 2 years ago | (#39276901)

I bought a low end Fusion chip for my HTPC. The motherboard supports HDMI, DVI, VGA and a Displayport.

Re:GPU performance always wins (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39278045)

Interesting... How many simultaneously? I'm not a huge interface snob, I just need lots and lots of them...

Ivy bridge vs ARM (1)

DeadDecoy (877617) | more than 2 years ago | (#39276293)

How does Ivy Bridge compare to ARM? From what I've read, it appears that ARM has lower wattage, but I'm not sure about the performance.

Re:Ivy bridge vs ARM (1)

zigfreed (1441541) | more than 2 years ago | (#39276395)

Ivy Bridge compare to ARM

ARM is in the same neighborhood as the Atom N270 and Z530 [phoronix.com]. Cortex A11 is the next revision.

Re:Ivy bridge vs ARM (4, Insightful)

armanox (826486) | more than 2 years ago | (#39276553)

As soon as ARM tries to catch up to the performance of x86 (and x64) it no longer has the lower power consumption.

Llano (2)

zigfreed (1441541) | more than 2 years ago | (#39276297)

The CPU in Llano is 2 generations back... with Athlon II. Beating the pants off Bulldozer is easy for Intel: just find a benchmark optimized for single threads, compiled with ICC, or one that weights the single-threaded result. One of the major new features, the random number generator, wasn't even tested. Monte Carlo benchmarks, where are you? [nist.gov]

Like all tradeoffs (1)

Sloppy (14984) | more than 2 years ago | (#39276721)

The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off of the Fusion's). Is the tradeoff worth it?

It depends on what you're doing with it! Duh... Seriously, that was a deeply stupid question.

Re:Like all tradeoffs (1)

Nimey (114278) | more than 2 years ago | (#39277285)

It's a leading question in a Slashdot summary. It's hardly meant to be intelligent; I think the purpose is to drive discussion.

You see that somewhat often on news stories elsewhere, probably more at lower-quality establishments whose MO is to drum up controversy.

Yet another nail in AMD's coffin (1)

Frank T. Lofaro Jr. (142215) | more than 2 years ago | (#39276931)

1. AMD CPU bug
2. AMD divesting from its fab
3. Intel pulling even MORE ahead on performance and even lowering power usage at the same time!

Not to mention AMD's financial troubles and the fact they have a tendency to burn up.

Re:Yet another nail in AMD's coffin (1)

UnknowingFool (672806) | more than 2 years ago | (#39277075)

Plus Intel is focusing on ultrabooks, which is helped by better integrated graphics. I don't think AMD has an answer to that.

Re:Yet another nail in AMD's coffin (0)

Anonymous Coward | more than 2 years ago | (#39278371)

In case you didn't know, there are several minor bugs in every CPU. Like all of them, the latest AMD bug is very unlikely to occur in practice.

New CPUs made for laptops in mind? (1)

Dr. Spork (142693) | more than 2 years ago | (#39277311)

I'm a bit confused by the target market for these improvements. If you're buying one of these fancy chips for a desktop, you must have some reason to need all that power, and 90% of the people who have such a reason will also need their computer to have a discrete graphics card. If all you're doing is Facebook and photos, a cheap Core 2 Duo is more than you need. If you're gaming, you still can't do it without a discrete card. So now we hear that the only thing that really got improved in this generation is the integrated graphics, the one feature that just about no $250+ CPU buyer uses anyway. It would be one thing if the graphics portion could be repurposed to run some vector commands which the CPU could offload - but nothing like that appears to be in the works. So it seems to me that the graphics are just a waste of die space. I'd much rather see that used for CPU cache. Seriously, how hard would this be for Intel to do? It's not like a mask with extra cache instead of GPU would be all that hard for them to design!

I'll File this under "Who Cares?" (1)

DarthVain (724186) | more than 2 years ago | (#39277435)

Soooo, you built a CPU that barely runs faster than the previous generation CPU. However the integrated graphics are 20-50% better.

Integrated graphics for anything other than the most basic tasks are horrible by several orders of magnitude. You can buy a $130 discrete video card that will deliver 1000% of the graphics performance.

In real world terms this is like taking a game that runs at say 12FPS and making it run at 14-18 FPS which is still unplayable. More realistically you will take a game that is completely unplayable because the graphics won't even run, to making it barely run, but still being unplayable.

The 30W less heat is more interesting; it may make for a better overclock, or at least a safer one.

I guess this does advance the bar for integrated graphics, which academically is good. Realistically however no one will care for years, as the people that upgrade to the newest hardware are also the same people NOT using integrated graphics at all.

Good I suppose for that laptop market, though I didn't see a distinction here between desktop and mobile versions of the chip, so I assume they tested the desktop version.

As for the AMD quip, quit being stupid. So you buy a CPU solution that has slightly better integrated GPU but gets its ass handed to it in CPU? Makes sense.

Re:I'll File this under "Who Cares?" (0)

Anonymous Coward | more than 2 years ago | (#39277679)

You call >30 FPS on Skyrim, Crysis and Starcraft 2 "unplayable" or "horrible"? And the difference between 20 and 30 FPS is quite important.

It is simple... (1)

ak3ldama (554026) | more than 2 years ago | (#39279041)

The question is do you want to game, and at what price... Because for excellent prices you can buy quad core AMD Llano laptops with all the CPU performance you really need and the GPU performance to play a lot of games. Why would you want to go the Intel route, pay more, and get less gaming performance?

Plea for OpenCL template support ! (-1)

Anonymous Coward | more than 2 years ago | (#39279241)

OpenCL has no template support. What does this mean? For me it means throwing away all of our dimension-independent code built using templates and going back to repeat-yourself hell (a toy illustration follows this comment). So no thanks. It's our main reason, but we have a couple more for sticking with CUDA, and they are mostly ATI's fault:

When GPGPU started, nvidia was the only vendor. Why? Well, they were first. Still, although ATI added GPGPU computing capabilities pretty fast, they kind of discovered a funny fact: scientific people program scientific software on Linux computers/clusters. Not seeing the fun yet? A new multimillion-dollar market appears and ATI gets locked out because their Linux drivers suck. So now everyone using GPGPUs for number crunching is stuck on nvidia hardware, using either CUDA or OpenCL.

So now OpenCL is a standard and people using it ask me "why don't you switch?". Well,
- my codebase in CUDA is large,
- optimizing GPGPU code is not simple,
- CUDA tools (compiler, debugger, profiler...) are way better,
- we use templates (read above),
- all our GPGPUs are from nvidia, and without competition, they are going to be from nvidia for the foreseeable future.

Main problem: by using CUDA we are locking ourselves out of exploiting these heterogeneous cores. If you compare them with nvidia's number-crunching capabilities, these processors are like small tricycles for babies and nvidia's Fermi cards are Concordes. Still, we do worry: right now we have MPI, OpenMP, and CUDA in our code, and if OpenCL ever supports templates we'll have to add OpenCL to the mix. Problem?
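To make the template complaint concrete, here is a toy illustration (hypothetical, not from the poster's codebase) of dimension-independent code written once as a C++ template. In CUDA the same template would simply carry __global__/__device__ qualifiers, while OpenCL's C99-based kernel language has no equivalent, so each dimension needs its own kernel or macro-generated copies.

    // Toy example of dimension-independent code via C++ templates (hypothetical).
    #include <array>
    #include <cstdio>

    template <int DIM>
    double squared_distance(const std::array<double, DIM>& a,
                            const std::array<double, DIM>& b) {
        double sum = 0.0;
        for (int i = 0; i < DIM; ++i) {
            const double d = a[i] - b[i];
            sum += d * d;
        }
        return sum; // the same source is instantiated for 1D, 2D, 3D, ...
    }

    int main() {
        std::array<double, 2> p2{0.0, 0.0}, q2{3.0, 4.0};
        std::array<double, 3> p3{0.0, 0.0, 0.0}, q3{1.0, 2.0, 2.0};
        std::printf("%.1f\n", squared_distance<2>(p2, q2)); // 25.0
        std::printf("%.1f\n", squared_distance<3>(p3, q3)); // 9.0
        return 0;
    }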

Re:Plea for OpenCL template support ! (0)

Anonymous Coward | more than 2 years ago | (#39280651)

So how exactly are you using these templates? Could you achieve the same by using a preprocessor?
