Slashdot: News for Nerds

Haswell Integrated Graphics Promise 2-3X Performance Boost

timothy posted about a year ago | from the more-better-cheaper-faster dept.


crookedvulture writes "Intel has revealed fresh details about the integrated graphics in upcoming Haswell processors. The fastest variants of the built-in GPU will be known as Iris and Iris Pro graphics, with the latter boasting embedded DRAM. Unlike Ivy Bridge, which reserves its fastest GPU implementations for mobile parts, the Haswell family will include R-series desktop chips with the full-fat GPU. These processors are likely bound for all-in-one systems, and they'll purportedly offer close to three times the graphics performance of their predecessors. Intel says notebook users can look forward to a smaller 2X boost, while 15-17W ultrabook CPUs benefit from an increase closer to 1.5X. Haswell's integrated graphics has other perks aside from better performance, including faster Quick Sync video transcoding, MJPEG acceleration, and support for 4K resolutions. The new IGP will support DirectX 11.1, OpenGL 4.0, and OpenCL 1.2, as well." Note: Same story, different words, at Extreme Tech and Hot Hardware.


133 comments

IRIS you say? (1)

Anonymous Coward | about a year ago | (#43609491)

http://en.wikipedia.org/wiki/SGI_IRIS

Re:IRIS you say? (0)

Anonymous Coward | about a year ago | (#43609513)

http://www.sgistuff.net/hardware/graphics/iris.html

Re:IRIS you say? (1)

armanox (826486) | about a year ago | (#43610533)

My first thought as well.

Best thing about this (5, Informative)

earlzdotnet (2788729) | about a year ago | (#43609515)

is that Intel provides very nice open source drivers for their integrated GPUs

Worst thing about this (0)

Anonymous Coward | about a year ago | (#43609739)

Discrete graphics still significantly outrun Intel's offerings. We get a 150% performance increase when a 900% performance increase is warranted to compete with current cards (GeForce 680MX). Guess I'll never get integrated graphics if I can avoid it.

Re:Worst thing about this (4, Insightful)

MozeeToby (1163751) | about a year ago | (#43609795)

Think of these chips with integrated graphics like hybrid cars. You're not gonna go down to the drag strip with them, or haul a camper, or pick up the 10 kid carpool group. But for the vast majority of trips you'll get to the same destination in basically the same amount of time, with less noise and higher efficiency.

Re:Worst thing about this (0)

jedidiah (1196) | about a year ago | (#43609885)

Except your analogy is total nonsense. Modern operating systems do in fact benefit from having a decent discrete GPU. Performance improves in ways you might not even expect.

Then there are the things that you would expect to be better with a good GPU. All of these things improve with a GPU that doesn't suck. You don't have to be some obsessive gamer to be in a position to see the benefit either.

Historically, Intel GPUs have sucked so bad that no one wanted to support them at all.

Or to put it in automotive terms: even a family car needs the ability to get onto the highway without being a threat to public safety.

Re:Worst thing about this (4, Insightful)

h4rr4r (612664) | about a year ago | (#43610087)

So unexpected that you can't even name one!

Intel GPUs are fine for 99% of use. Heck, most games run fine on them. Sure the latest Call of Honor: Medal of Duty will not run on Ultra, but most games will be fine on medium and low settings.

Re:Worst thing about this (1)

kimvette (919543) | about a year ago | (#43610605)

No benefits to a discrete GPU?

Ripping/Transcoding (thank you CUDA and OpenCL)
Running games at resolutions and detail levels that look better than doom
Image processing (thank you CUDA and OpenCL)
Video decoding/decompression (thank you CUDA and OpenCL)

Intel is doing remarkably well with the HD 4000, and OpenCL performs pretty well, but that won't come close to matching the 48 to 3072 GPU cores present in modern discrete video cards.
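For anyone who wants to see what the two sides actually report, the OpenCL API makes the comparison easy to eyeball: every device exposes CL_DEVICE_MAX_COMPUTE_UNITS. Below is a minimal C sketch against the standard OpenCL 1.2 headers; keep in mind that "compute units" are not directly comparable across vendors, and the output depends entirely on which drivers are installed on your machine.

/* Minimal sketch: list every OpenCL device and its compute-unit count.
 * Uses only the standard OpenCL 1.2 API; link with -lOpenCL.
 * What it prints depends entirely on the drivers installed. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;
    if (num_platforms > 8)
        num_platforms = 8;

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;

        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8,
                           devices, &num_devices) != CL_SUCCESS)
            continue;
        if (num_devices > 8)
            num_devices = 8;

        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256] = {0};
            cl_uint units = 0;

            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(units), &units, NULL);
            printf("%s: %u compute units\n", name, units);
        }
    }
    return 0;
}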

Re:Worst thing about this (2)

LordLimecat (1103839) | about a year ago | (#43610659)

IIRC transcoding / decoding with Intel's quicksync is actually VERY competitive with a discrete GPU. And all of those CUDA / OpenCL tasks are hardly representative of the average user.

As parent said, for the 99% use case, Intel integrated are sufficient.

Re:Worst thing about this (1)

Creepy (93888) | about a year ago | (#43610847)

They probably use the SIMD instruction set (aka SSE, formerly MMX) for parallelization. Honestly, I'd rather write to a GPU using a common language, but parallel technology does exist on the CPU: inflexible, single-purpose parallel instructions, but ones that would work for that task.
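As a rough illustration of what CPU-side SIMD looks like (a sketch only, not code from any actual transcoder), SSE intrinsics let a single instruction operate on four floats at once:

/* Illustrative only, not code from any real transcoder: add two float
 * arrays four elements at a time with SSE intrinsics. */
#include <stdio.h>
#include <xmmintrin.h>          /* SSE intrinsics (SSE1) */

static void add_sse(const float *a, const float *b, float *out, int n)
{
    int i;
    for (i = 0; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);            /* load 4 floats */
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb)); /* 4 adds in one op */
    }
    for (; i < n; i++)                              /* scalar tail */
        out[i] = a[i] + b[i];
}

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    add_sse(a, b, out, 8);
    for (int i = 0; i < 8; i++)
        printf("%g ", out[i]);   /* prints 9 eight times */
    printf("\n");
    return 0;
}

Auto-vectorizing compilers can often emit the same instructions from the plain scalar loop, which is part of why "SIMD on the CPU" shows up even in software that never mentions SSE explicitly.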

The average user (2, Interesting)

tepples (727027) | about a year ago | (#43611057)

And all of those CUDA / OpenCL tasks are hardly representative of the average user.

There's a meme lately in Slashdot comment sections that everything must be made for "the average user" without any room to grow. I see it popping up whenever anybody mentions limits of a product sold to the public, especially artificial market-segmenting limits. Where did it come from?

Re:The average user (2, Insightful)

Anonymous Coward | about a year ago | (#43611443)

And there is a trend of you posting stupid shit like this. Why would Intel want to release an expensive, fast, power hungry integrated GPU as part of their mainstream offerings? There are already vendors that sell expensive, fast, power hungry GPUs. Buy one of those.

Re:The average user (4, Insightful)

LordLimecat (1103839) | about a year ago | (#43611509)

The discussion was explicitly on whether what Intel is doing here is useful. For the majority of the market, the answer is yes.

There are areas where OpenCL and CUDA are useful. That is in no way relevant to the discussion. No one is saying that powerful hardware isn't useful; we're saying that most users have no need for it and that integrated graphics are actually useful to actual people.

Re:The average user (2)

craigminah (1885846) | about a year ago | (#43612321)

The target is most likely HTPCs or something basic. If Intel's integrated graphics aren't adequate then pop in a real GPU and bypass the integrated graphics. Simple, and both sides are satisfied. If Intel integrated a full fledged GPU into their CPU then it'd increase die size, increase power consumption, increase heat generation, etc. A small, efficient, powerful-enough integrated solution is the best fit for the vast majority of users, since they can still opt to add a dedicated GPU if needed.

Re:The average user (1)

tepples (727027) | about a year ago | (#43612519)

A small, efficient, powerful-enough integrated solution is the best fit for the vast majority of users, since they can still opt to add a dedicated GPU if needed.

In the case of a desktop, I agree with you. But in the case of a laptop, people have to plan several years in advance because you can't replace the GPU without replacing everything else.

Re:The average user (1)

lgw (121541) | about a year ago | (#43612757)

Do you really see a market for a portable computer with a discrete graphics card several years from now? I expect the future will be tablets-with-keyboards distinct from non-portable gaming systems with better graphics.

Re: The average user (1)

UnknowingFool (672806) | about a year ago | (#43614365)

If you are using your laptop for hardcore gaming or bitcoin mining, I'm pretty sure you are going to spec it out correctly. Checking Facebook and checking out cat videos on YouTube? I don't think that requires detailed multi-year planning.

Re:The average user (2)

hairyfeet (841228) | about a year ago | (#43613155)

Because the average user is a huge market, whereas those using niche applications for corner cases are naturally... well, niche use cases?

While I would love nothing more than for the average user who comes into the shop to need some hexacore or octocore with a monster GPU, as I just love playing with the latest tech, in reality the vast majority who walk through my door would be hard pressed to stress the entry level Athlon triple core with a low end HD 5450 that I sell as a base model. Because of this I naturally sell a hell of a lot more base models than monster rigs, so more of my business is focused on those customers, since they bring in the most sales.

It's really just good business sense, which is why I applauded when AMD stopped making the high end GPUs first and then figuring out how to cut them down, and instead went to making mainstream chips that can then be doubled up for the high end market. There just aren't enough people buying the really top o' the line stuff to make it the central focus, as much as we PC gamers wish it were different.

Re:Worst thing about this (2)

fazig (2909523) | about a year ago | (#43610693)

Of course, when it comes down to raw performance for high performance applications, integrated solutions won't be able to keep up with current discrete solutions.
But when you take other criteria into account, like the volume discrete hardware takes up in a case, the energy consumption and, subsequently, the need for a cooling system, then integrated solutions become a lot more attractive.

Re:Worst thing about this (2)

hairyfeet (841228) | about a year ago | (#43613019)

Haven't looked at the latest cards, have you? One of the reasons that AMD went with GCN over their previous VLIW design is that GCN allows for core parking: when you are doing non GPU intensive jobs the GPU can put to sleep all the cores you no longer need, bringing the wattage and heat WAY down, almost to IGP levels, without being stuck with only IGP levels of performance.

And as far as noise goes, most of the cards have a passively cooled variant; you can't get quieter than that, and again without giving up the huge performance advantage the discrete cards give you.

Run compute intensive tasks remotely (2)

tepples (727027) | about a year ago | (#43610979)

thank you CUDA and OpenCL

OpenCL-heavy tasks can be done on a compute server at home or in a data center, and you can SSH (or VNC or RDP or whatever) to use an application on a compute server from your laptop. The only real use case I see for carrying an OpenCL powerhouse with you, apart from running shaders in a high-detail 3D game, is for editing huge images or high-definition video in a vehicle or some other place with no Wi-Fi. One workaround is to downscale the video to low definition (e.g. 320x180), edit the low-definition video while away from the net, and then export the edit decision list (EDL) back to the compute server to render the result in high definition. I used to do that with AviSynth.

Running games at resolutions and detail levels that look better than doom

Games are the other reason for carrying a beefy GPU with you. But Skyrim looks better than Doom, Doom II, and Doom 3, and Skyrim runs playably on the HD 4000 at 720p medium [anandtech.com] .

Re:Run compute intensive tasks remotely (1)

sexconker (1179573) | about a year ago | (#43611637)

thank you CUDA and OpenCL

OpenCL-heavy tasks can be done on a compute server at home or in a data center, and you can SSH (or VNC or RDP or whatever) to use an application on a compute server from your laptop. The only real use case I see for carrying an OpenCL powerhouse with you, apart from running shaders in a high-detail 3D game, is for editing huge images or high-definition video in a vehicle or some other place with no Wi-Fi. One workaround is to downscale the video to low definition (e.g. 320x180), edit the low-definition video while away from the net, and then export the edit decision list (EDL) back to the compute server to render the result in high definition. I used to do that with AviSynth.

Running games at resolutions and detail levels that look better than doom

Games are the other reason for carrying a beefy GPU with you. But Skyrim looks better than Doom, Doom II, and Doom 3, and Skyrim runs playably on the HD 4000 at 720p medium [anandtech.com] .

There's a world of difference between having compute power on your machine and on a machine you have remote access to.
And there's a world of difference between running a game at 20-30 fps on medium at a sub-native resolution and running it as intended.

Re:Run compute intensive tasks remotely (1)

tepples (727027) | about a year ago | (#43612099)

There's a world of difference between having compute power on your machine and on a machine you have remote access to.

What's the practical effect of this "world of difference"? I need some ammo against the oft-repeated argument that "Apple's App Store restrictions are irrelevant because the iPad can run SSH and VNC".

And there's a world of difference between running a game at 20-30 fps

The article I linked says 46 fps.

on medium

Does the PS3 version even go higher than medium?

at a sub-native resolution

The article I linked says 1366x768. (I rounded it to 720p for the reader's convenience.) How is this "sub-native" on a laptop with a 1366x768 panel?

Re:Worst thing about this (2, Interesting)

Anonymous Coward | about a year ago | (#43611879)

I do HPC engineering for a living, and I really don't see the point in private discrete GPUs anymore. We've added 8000 Teslas to our cluster, and I've come to prefer using them over CPUs simply for performance reasons. But likewise I've come to prefer IGPs to discrete cards for private use in the last 2-3 years. There is no game that isn't playable on an Ivy Bridge IGP (the last I've run is Skyrim at 1920x1080 with settings in the middle between average and max, 40-50 FPS), the power usage is lower (in general but also when watching movies), complexity is tremendously lower (which translates into more stable drivers), it's cheaper and fwiw lighter.
Sure, if you're a CAD guy or 3D artist, you'll be able to do bigger stuff faster, but apart from those highly specific needs of a tiny part of the population (who can and should buy "specialised hardware", that means discrete GPUs), there is little advantage over IGPs.

You mention transcoding media, hardly a common task, but what's the problem with one core chugging away at it in the background while you use your other system resources? A GPU may do it quicker, but either you can't use it to its full capacity for transcoding while at the same time playing that AAA title, or you're not needing the resources at the moment anyway, so "quicker" becomes nothing but a luxury.
Video decompression is a moot point, as 500 MHz ARM chips can do full HD via DSP decoders, and you've got plenty of the latter on modern x86 chips.

The only somewhat "valid" point is e-peen: you'll get higher benchmark scores and feel better for having high-end hardware, which is up to everyone personally. I like the feeling of my tiny little "thingie" reaching the same goals more efficiently, with lower power usage, thermal emission and complexity.

captcha: tearful

Re:Worst thing about this (1)

Pausanias (681077) | about a year ago | (#43612011)

Perhaps this will convince NVIDIA not to underclock their GPUs. Now that the baseline is much higher, they will have to deliver awesome performance to be relevant in the notebook scene.

Re:Worst thing about this (1)

D1G1T (1136467) | about a year ago | (#43612293)

er, no. While I agree these chips are great for most uses (I have them in 2 of my 3 primary computers at home), if you want to game at all, you need something more. Even low-requirements WoW is basically unplayable on the latest 4000 series chip on medium settings even if combined with a quad core i7.

Re:Worst thing about this (1)

hairyfeet (841228) | about a year ago | (#43612909)

I can: try a hardware accelerated desktop in Win 7 or 8 and then try the same without the acceleration; the former will be MUCH more responsive and snappy. Also, by taking some of that load off the CPU you leave more cycles for the jobs that the user cares about, another benefit.

Re:Worst thing about this (2, Informative)

Anonymous Coward | about a year ago | (#43610223)

Performance improves in ways you might not even expect.

I guess it improves it in such an unexpected way I didn't even notice. In the two desktop computers I have in my household, I originally built them using integrated video cards and then upgraded to discrete cards couple months later when I had some more time to pick out something and some spare cash. Neither my wife nor I noticed any difference in normal desktop performance for office related software or web browsing. The only difference seen was in video games. My work issued laptop has integrated GPU, and I'm not sure what performance could be improved on it other than maybe a faster hard drive (there is only so fast I can type, click through documents, and my code runs on a cluster instead of locally...).

Or to put it in automotive terms: even a family car needs the ability to get onto the highway without being a threat to public safety.

Yes, while this is true, it really isn't relevant. You look foolish if you try to claim hybrids are not usable because they can't drive at highway speeds, as they are already capable of that. Likewise, integrated GPUs are quite fast enough for the vast majority of basic computer use and only a hindrance to specific uses that many people may or may not need (i.e. not everyone does high end gaming, video editing, etc.).

Re:Worst thing about this (3, Informative)

fast turtle (1118037) | about a year ago | (#43610999)

Actually, video editing doesn't really benefit from a discrete GPU since the encoding support is still crap. Most of the various software I've looked at still gets more bang from a better CPU than from GPU encoding, and if you're in the industry like ILM/Pixar, then you ain't using GPU encoding anyhow; it's mainly dedicated ASICs and such. For someone doing it as a hobby, they're buying a video card specifically supported by their software, so it makes no god damn difference to 99.99 percent of the folks out there that onboard graphics suck.

In my case, as a small business owner, I've been planning a new build for the 4th quarter (part of my 4-year replacement cycle) based on a Xeon E3 1275 with onboard graphics, because the system's purpose doesn't need much in the way of a GPU. It's a development system (builds and database work), so why waste money? Hell, all of my employees' systems are onboard graphics just to save a few bucks that's better spent on more RAM or a slightly better CPU. As with anyone, trade-offs are required when building/speccing our systems, and as a business we tend to go with the cheapest configurations we can get. Keep in mind that the cheapest configuration does not mean the cheapest parts. We learned a long time ago that spending a bit more for quality hardware resulted in less downtime.

Room to grow is another factor (1)

tepples (727027) | about a year ago | (#43611159)

In the two desktop computers I have in my household, I originally built them using integrated video cards and then upgraded to discrete cards [...] The only difference seen was in video games. My work issued laptop has integrated GPU, and I'm not sure what performance could be improved on it

That's because your work laptop's work load probably doesn't have any 3D. If your job description included CAD or other 3D modeling, you might notice more of a difference with a beefier GPU.

not every one does high end gaming, video editing

Room to grow is another factor. Consider someone who buys a computer and then decides he wants to do non-Flash gaming or edit high-1080p video from his smartphone's camera. Suddenly his laptop has become obsolete. Is he necessarily going to have the money to buy a whole new laptop?

Re:Room to grow is another factor (0)

Anonymous Coward | about a year ago | (#43611445)

That's because your work laptop's work load probably doesn't have any 3D. If your job description included CAD or other 3D modeling, you might notice more of a difference with a beefier GPU.

Of course, but that seems irrelevant to the previous post that was countering the idea that basic computer and OS use would benefit in unexpected ways from a discrete card. If you are going to be doing something that is 3D intensive, or is making the transition to GPGPU, then of course there will be improvements by investing more in a higher end video card. The whole point of the post was that for more basic things though, it isn't needed and there is not some mysterious boost that an integrated GPU falls short on.

Room to grow is another factor.

Actually, for a desktop, this was the whole reason I went with an integrated GPU. It cost an extra $10 to get a CPU with it instead of one without it. For a desktop at least, I still have the option of getting a discrete card down the line if I need it, and then I can choose exactly what I need at that point. If I had instead gone for a more modest discrete GPU now that I don't have much need for at the moment, it would be wasted if down the line I found out I needed something with more power.

Re:Room to grow is another factor (0)

Anonymous Coward | about a year ago | (#43611505)

Oh gee, I spent $1900 on this laptop and can't get more than 3 hours of battery life out of it. Guess I'll buy some stupid FPSes. Said nobody ever.

Re:Worst thing about this (2)

realityimpaired (1668397) | about a year ago | (#43610235)

Except your analogy is total nonsense. Modern operating systems do in fact benefit from having a decent discrete GPU. Performance improves in ways you might not even expect.

Nope. They benefit from having 3D acceleration, yes, but the integrated graphics on modern Intel chips is a discrete core. It just happens to be on the same die as the CPU. My laptop's CPU is clocked at 1.2GHz dual core, with two extra cores for the video clocked at 300-500MHz. Those cores are dedicated to the video only, and it's *plenty* fast enough for normal use on the operating system, with all of the blingy effects. Switching to a discrete graphics card won't make any difference at all, because the video cores are physically separated from the CPU cores, and have a completely different execution pipeline so wouldn't be able to run OS calls anyway.

Intel's integrated video is enough to run most games these days. I game on that laptop and while the graphics aren't as fast as my desktop's 6970, they're plenty adequate for gaming on the go... and I'm not just talking about ancient games here, either: it's good enough for Civ5 in WINE, and also for stuff like Torchlight II, which it'll run at max on the laptop's 1366x768 screen. You won't be playing the latest Call of Duty at maximum settings on a dual 1920x1080 display with Intel's integrated graphics, but then again, the people who want to do that won't be bothering with integrated graphics to begin with, will they? A significant portion of Steam's userbase are using Intel integrated graphics these days....

Re:Worst thing about this (1)

hairyfeet (841228) | about a year ago | (#43612881)

Yeah but last I checked even a lousy $30 discrete just curbstomps them and likewise the AMD APUs beat them pretty badly in the GPU dept.

So unless they pulled a rabbit out of their hat I'll be telling folks the same thing I have always said about Intel IGPs, which is good for office units and not much else. I mean sure for spreadsheets and basic video they are fine, but IMHO you are better off getting even a low end discrete over using the Intel IGP.

Re:Worst thing about this (1)

DavidClarkeHR (2769805) | about a year ago | (#43609869)

Discrete graphics still significantly outrun Intel's offerings. We get a 150% performance increase when a 900% performance increase is warranted to compete with current cards (GeForce 680MX). Guess I'll never get integrated graphics if I can avoid it.

Intel isn't competing with discrete graphics solutions, though.

And yes, the speed increase isn't spectacular when compared to the other options in the marketplace, but they're not exactly "alternatives", and they're certainly not "competing".

Re:Worst thing about this (1)

PhrstBrn (751463) | about a year ago | (#43610025)

You mean a $300 part that is a CPU and GPU combined has a slower GPU component than a $500 dedicated GPU? Shocking. Utterly shocking.

Re:Worst thing about this (1)

Creepy (93888) | about a year ago | (#43611115)

Only they are failing to compete with $20 cards as well, and their best offering is smoked by SOC (sorry, lingo - System On a Chip) like the AMD A10 in the graphics department.

Re:Worst thing about this (1)

Guspaz (556486) | about a year ago | (#43611305)

The Iris Pro has similar performance to a GeForce GTX 650. Please let me know where you can get a GTX 650 for under $20.

You'd have to go back more than one generation to find an Intel iGPU that is slower than a $20 discrete card.

Re:Worst thing about this (1)

JDG1980 (2438906) | about a year ago | (#43613287)

Only they are failing to compete with $20 cards as well, and their best offering is smoked by SOC (sorry, lingo - System On a Chip) like the AMD A10 in the graphics department.

The AMD A10 is not a "SoC" any more than Intel's offerings are. Both AMD and Intel are currently offering CPUs with integrated GPUs – it's just that the marketing is slightly different, and that each company emphasizes its own strength (AMD has better GPUs, Intel has better CPUs).

For these to reasonably be considered a "SoC", the whole northbridge (or whatever they're calling it now) would have to be moved onto the main die. That hasn't happened yet, though Intel is starting to move that way with Haswell.

Re:Worst thing about this (1)

Guspaz (556486) | about a year ago | (#43611187)

The 680 isn't mainstream, by any means. Haswell brings the higher-end iGPU up to the performance levels of a GeForce GTX 650, which definitely is more mainstream.

Re:Best thing about this (0)

jbernardo (1014507) | about a year ago | (#43609889)

Tell that to Poulsbo chipset buyers...

The promised Gallium3D driver [phoronix.com] was never delivered, even if it apparently was nearly ready to release, and Intel only kicked users around between their desktop and automotive teams, releasing some crappy binary drivers to keep them quiet.

Re:Best thing about this (2)

jonwil (467024) | about a year ago | (#43609993)

I suspect a lot of the problem there has to do with Imagination Technologies (creator of the PowerVR GPU core in the Poulsbo parts) and how much Imagination Technologies were willing to let Intel release (either as binary drivers or as source code)

Re:Best thing about this (0)

jbernardo (1014507) | about a year ago | (#43610089)

I suspect a lot of the problem there has to do with Imagination Technologies (creator of the PowerVR GPU core in the Poulsbo parts) and how much Imagination Technologies were willing to let Intel release (either as binary drivers or as source code)

And you're probably right with respect to not releasing an open source driver for the Poulsbo chipset. But Intel treated their customers with the utmost disrespect, only pretending to be working on a usable driver until it was discontinued and abandoned. No decent closed source driver was released after the first iteration, before the promise of the Gallium 3D driver. Users were lied to, and pushed from one team to the other looking for drivers. That wasn't Imagination Tech - that was Intel stalling and lying to keep the noise down.

Nothing assures us that it won't happen again.

Re:Best thing about this (1)

dpokorny (241008) | about a year ago | (#43610457)

Intel still supports and releases binary drivers for the Poulsbo platform.

http://www.intel.com/content/www/us/en/intelligent-systems/intel-embedded-media-and-graphics-driver/emgd-for-intel-atom-systems.html [intel.com]

It has never been a one-shot or abandoned platform. However, you need to understand that this device was specifically designed for embedded applications and not general purpose computing products.

It's unfortunate in my mind that several manufacturers released it as a consumer product.

Re:Best thing about this (3, Informative)

h4rr4r (612664) | about a year ago | (#43609999)

Not an intel chip, it came from PowerVR.

Anyone who bought one of those was simply a fool. Everyone gets to be a fool now and then, so don't feel too bad if it bit you.

Re:Best thing about this (0)

jbernardo (1014507) | about a year ago | (#43610171)

Anyone who bought one of those was simply a fool. Everyone gets to be a fool now and then so don't feel to bad if it bit you.

Intel fooled me once. They won't fool me twice, as I won't buy their chipsets again. My netbooks are AMD APUs, my tablets and smartphones are ARM. Good riddance!

Re:Best thing about this (1)

h4rr4r (612664) | about a year ago | (#43610263)

When AMD makes a decent open source driver I will use their graphics cards. So far I use Intel GPUs and Nvidia ones. The latter will no longer end up in my home once Intel gets good enough. At that point I will likely stop buying AMD CPUs.

AMD's Open Source Linux Driver Trounces NVIDIA's (1)

tepples (727027) | about a year ago | (#43611199)

When AMD makes a decent open source driver I will use their graphics cards. So far I use intel GPUs and Nvidia ones.

You're in luck. AMD's open source driver trounces NVIDIA's [slashdot.org] .

Re:Best thing about this (1)

higuita (129722) | about a year ago | (#43611259)

So you are using open Intel drivers, which are more or less on par with AMD's (usually lagging a few months behind Intel, because they share most of the Mesa code; it's easier to catch up than to build from the ground up), with AMD cards being more powerful and so faster.

I play Linux games (via Humble Bundle, Steam and Desura) with the AMD open drivers, so I can tell you that they work. If you check the Phoronix benchmarks, AMD closed and open drivers have an average performance difference of 20%; that isn't perfect, but not bad either...

You say you are also using Nvidia:
-if closed source drivers -> yes, they are faster; yes, they are closed source shit from a company that doesn't really care about Linux, just their money (as Linus said: F*ck you Nvidia)
-if open source drivers -> they share the Mesa part with all the others, but lag several years behind, to the point that they aren't really usable for most people

So it looks like you are a "little" biased!

Re:Best thing about this (1)

fast turtle (1118037) | about a year ago | (#43610785)

Bullshit if it's the PowerVR crap - no open source drivers at all. Check it out before you make a blanket statement, but I will agree that for their HD series (which Iris is simply an improvement of) they do offer a compelling reason to consider them if going the open source route.

A major feature is still missing in action (1)

Fallout2man (689436) | about a year ago | (#43611015)

How long until we finally have Intel and other CPU vendors create a unified memory model now that we have a GPU on die? I mean if anything I'd think the point would be to have your on-die GPU integrate with a discrete card so that both low and high-end setups gain something from this. PS4 will have a unified memory model; how long until the rest of us do on our desktops?

supports Display Port 1.2 (2, Interesting)

Anonymous Coward | about a year ago | (#43609569)

the Hot Hardware link confirms DisplayPort 1.2, which is the only thing I /really/ care about. The others are nice, but 4K out of the laptop means my next mid-range laptop can be my primary desk machine as well. This should push along the display manufacturers after their decade of stalling (perhaps soon we'll see screens in the 20-24" range with more resolution than our 5" displays have).

No, probably not (1)

Sycraft-fu (314770) | about a year ago | (#43609983)

Making a 4k display isn't as simple as manufacturers just wanting it bad enough. I know people like to look at little phone screens and say "If these can be high rez, why can't big displays be higher rez!" but all that shows is a lack of understanding of the situation.

Transistors cost money and pixels require them. How many pixels a display has is not a small part of its cost. So you can't just say "Let's have 4k or 8k desktop displays, shouldn't be hard!" because it in fact is.

That isn't to say we won't, it is coming, but it will be some time.

Also there are plenty of other technologies that you need for higher rez, like interconnect speed (as you note with DP 1.2), video memory, video processing power and so on. Display makers haven't been "stalling"; it is something that hasn't been very doable. If you looked around, you could find that there were (and are) a few examples of higher rez displays, but they were expensive and plagued with issues like low refresh rates and requiring multiple dual link DVI connections to run.

Seriously, people need to stop pretending like companies could just give us some awesome new technology if they wanted it badly enough. No, not really. There's a lot that goes into something like a higher rez display. You don't just wish it into existence.

So chill. Higher DPI displays will happen at some point, probably sooner rather than later. However it isn't a situation of "stalling". It is a situation of R&D, production costs, and all the technology needing to exist.

If you want a 4k display right now, go get a Sony PVM-X300. It's a no-shit 4096 x 2160 30" display. Just don't bitch about the cost.

Size of market (1)

DrYak (748999) | about a year ago | (#43610385)

The technology is out there; witness the few dead-cheap dumb high-res Korean screens.
They only cost a bit more than similar lower, HD-res screens from the same featureless no-name brands.

What is lacking is a huge market, so that economies of scale kick in and producing ueber-high-resolution screens becomes worthwhile.
Currently the biggest chunk of all flat panels produced ends up in TV screens. It makes more sense economically for the manufacturer to just put the same panels into computer screens than to market a special, different type of panel with higher res targeting the small niche market of users who want a PC monitor *and* also want higher resolution.

At least we can count on Apple and their "retina" buzzword to make higher resolutions fashionable, and thus increase demand enough that the big brands will start noticing.
Meanwhile, try to get a no-name high-res IPS panel imported from Korea on eBay.

We're closer than it seems (2)

Overzeetop (214511) | about a year ago | (#43610465)

http://www.engadget.com/2013/04/12/seiki-50-inch-4k-1300/ [engadget.com]

$1300 for a 4k display. Granted, it's locked to 30Hz, but for most of us 60Hz will be as fast as we need to go (though we'll get more for that god-awful 3D crap they keep trying to push). 4k at 50" is very close to the 2560x1600 30" monitor I have in pixel size, which is fine enough for me at my working distance.

We stalled at 1920x1080 because everyone moved to TV production. Now that 4k/8k has broken free, we can get over that hump. Not saying there aren't hurdles, but the consumer limit has been breached and I expect the next two years to result in a shift. Note: there is no material broadcast at 60 frames at 1080, but people are all bonkers over 240Hz displays anyway. They'll be the same ones who wanted 1080p devices to watch their (upscaled) DVDs.

Re:No, probably not (0)

Anonymous Coward | about a year ago | (#43610649)

I picked up an X300 for my office. It really is an incredible display, even at the nearly $30k price tag. Very very happy with it. I do look forward to being able to drive it from a notebook instead of a tower under my desk.

Re:No, probably not (2)

JDG1980 (2438906) | about a year ago | (#43613529)

Seiki has a 50" TV with a 3840x2160 resolution, available right now for $1499 [tigerdirect.com] . So I don't buy the argument that it's somehow technologically prohibitive. Why can this crappy company no one has ever heard of bring out a 4K TV under $1500, but no one else can make a 4K monitor in an even smaller size (32" or so) without charging over $5500? (and that's for Sharp's offering, which is the next least expensive – most 4K monitors cost as much as a new car). As far as I can tell, it's not technological barriers but a desire to segment the market and charge professional and medical users out the ass for 4K displays.

Re:supports Display Port 1.2 (2)

tlhIngan (30335) | about a year ago | (#43611035)

the Hot Hardware link confirms DisplayPort 1.2, which is the only thing I /really/ care about. The others are nice, but 4K out of the laptop means my next mid-range laptop can be my primary desk machine as well. This should push along the display manufacturers after their decade of stalling (perhaps soon we'll see screens in the 20-24" range with more resolution than our 5" displays have).

Know what else supports 4K? HDMI.

And I'm fairly sure that if you're willing to pay $10K for a 4K screen (the current cheapest on the market - some Chinese knockoff), display manufacturers are willing to give you the 4K display you want.

Or you could pay $1K and get high res 27-30" screens as well just as you always could. You won't be able to find a 4K screen for $100 until 4KTVs are down in the under-$1000 range like HDTVs are now.

The only reason display resolutions "stalled" was because everyone was attracted to cheap cheap cheap - cheap laptops, cheap LCD monitors, etc. You could always have paid more and gotten better - 2560x1440 offerings for 27"+ have been around for years. Of course, they cost more, and everyone says "why did you pay so much when you could've gotten one for $200!" (See the last Apple monitor /. article where people assumed it was a 1080p screen when it wasn't).

Ditto laptops - we aren't seeing revolutions because of ultrabooks - just that ultrabooks have managed to pull an Apple and command extra money, and thus give what laptops in that price range have always given - better specs. If you skipped past the $300 laptops on display and upped it to $1000+, you'd find laptops with discrete GPUs, screens higher than 1366x768, and other niceties.

What's happened is good products have price floors. Races to the bottom means cutting out stuff to save money so you get crappy products cheaper.

Re:supports Display Port 1.2 (0)

Anonymous Coward | about a year ago | (#43612133)

http://www.engadget.com/2013/04/12/seiki-50-inch-4k-1300/
$1300 != $10K

Re:supports Display Port 1.2 (2)

Kjella (173770) | about a year ago | (#43612705)

And I'm fairly sure that if you're willing to pay $10K for a 4K screen (the current cheapest on the market - some Chinese knockoff), display manufacturers are willing to give you the 4K display you want.

Try less than $5K for the Sharp PN-K321 [compsource.com] professional monitor.

Or you could pay $1K and get high res 27-30" screens as well just as you always could. You won't be able to find a 4K screen for $100 until 4KTVs are down in the under-$1000 range like HDTVs are now.

Is $1300 [shopnbc.com] close enough?

The only reason display resolutions "stalled" was because everyone was attracted to cheap cheap cheap - cheap laptops, cheap LCD monitors, etc.

LCDs have been growing and dropping steadily in price; I picked up a 60" LCD at about half of what a 42" LCD cost me five years earlier. That's double the screen estate ((60/42)^2) for considerably less, while resolution has stayed ridiculously expensive. 4K - as opposed to 2560x1600/1440, which never saw any adoption outside a few niche computer monitors - has the potential to be the new HDTV. Right now you're paying early adopter premiums, but I've read that it should really cost 1.5-2x what a similar full HD screen costs, so if the TV momentum and volume gets behind it there'll soon be cheap 4K.

Re:supports Display Port 1.2 (1)

JDG1980 (2438906) | about a year ago | (#43613647)

Know what else supports 4K? HDMI.

The current HDMI revision only supports 4K at frame rates of 30 fps or below, so it's not really suitable for anything except watching film-sourced content. Supposedly HDMI 1.5 might support 4K@60Hz, but this is not confirmed. You need DisplayPort to do it now.

Leaked months ago (0)

Hadlock (143607) | about a year ago | (#43609581)

Last November it was revealed that Intel processors would have GT1, GT2 and GT3 graphics in Haswell. The only difference is that Intel has lifted the muzzle of a press embargo on Haswell to push more Ivy Bridge (and yes, even Sandy Bridge) units out the door to clear out backlogs.

It's been known since last year that the release date for Haswell is June 2nd, but nobody is allowed to report on that for fear of losing Intel advertising dollars.

Re:Leaked months ago (0)

Anonymous Coward | about a year ago | (#43609763)

It's June 3rd and Intel has said so themselves.

Now intel users can play 10 year old games :D (1)

oic0 (1864384) | about a year ago | (#43609591)

They might even be able to turn up their graphics in WoW now :D

Re:Now intel users can play 10 year old games :D (1)

josephtd (817237) | about a year ago | (#43609643)

Exactly. It seems we hear this every time Intel has an integrated graphics upgrade. By that I mean a chorus of marketing speak to make us believe their graphics offerings compete with and surpass lower end discrete GPUs. In reality, there is hype, hope and ultimately disappointment as the parts actually end up in the hands of users. It's getting old. Integrated GPUs are still great for office applications and basic OS effects rendering. The real benefit is cost savings to manufacturers and battery savings to users. It's a trade-off, nothing more. Not saying they don't have a place, but let's be realistic....

Re:Now intel users can play 10 year old games :D (1)

h4rr4r (612664) | about a year ago | (#43609841)

Honestly, unless you are playing Crysis they are getting pretty good. I played Portal 2 on an IGP. That is two years old, but the laptop I played on was already a year old at that point.

Re:Now intel users can play 10 year old games :D (1)

Shinobi (19308) | about a year ago | (#43610401)

Portal 2 runs on an engine that's effectively 12 years old by now, with just some updates. It's far more CPU dependent than more modern engines, for example.

Same thing with Left 4 Dead 2, the benchmark of which Valve rigged by using a 1½ year newer update for the Linux version than what's available for the Windows version, an update that actually shifts more work to the GPU.

Re:Now intel users can play 10 year old games :D (1)

xynopsis (224788) | about a year ago | (#43610431)

Honestly, unless you are playing Crysis they are getting pretty good. I played Portal 2 on an IGP. That is two years old, but the laptop I played on was already a year old at that point.

New Intel IGPs do handle Crysis [youtube.com] with fluid framerates even with quality settings turned high.

Re:Now intel users can play 10 year old games :D (0)

Anonymous Coward | about a year ago | (#43610003)

That is not true. I have the HD 4000 paired with an nVidia chip. The 4000 handles 90% of my games (about 800 of them at last count) and at a MUCH lower power draw (less heat too). There are a few games it struggles with. For those I have the nVidia chip. Will it run Crysis 2/3 acceptably? Prob not. But most games out there rarely push the GPU much these days. They are usually shovelware from consoles or aimed at the sweet spot in Steam's metrics. 'Indie' games are usually much less taxing and you prob could get away with an even older GPU from Intel to run them.

For most people the 4000 works pretty well. For your 'enthusiast' (translation: I run high res games and like blinky LEDs in my case) not so much. They want 300 fps in some game where 60-80 would be plenty.

I however do not expect much. 2x of what they have? That is still pathetic. The day when I go 'hmm, do not need a separate one' will be a grand day indeed. Anyone who thinks the existing nVidia GPU market will be around for this sort of thing long term has been ignoring Moore's law. nVidia knows it too. They have been trying to get into every console, mobile, whatever to make sure they keep shipping chips. The GPU will be part of the CPU within 3-6 years. It is good Intel has decided to get their act together on it.

Notice how their literature deliberately does not compare themselves to AMD and nVidia?... They are still 'sucky' but they are hitting the 'good enough' spot. Which is a great place to be, as you can move volume...

Re:Now intel users can play 10 year old games :D (0)

Anonymous Coward | about a year ago | (#43610715)

You should compare the Toms/Anand graphics benchmarks for the Radeon 7xxx series vs the 5xxx series released 3 years ago. Over that timespan, performance has gone up around 30% or so. Power consumption has dropped by something like half in addition.

During the same timeframe (ok, stretching a few months for Haswell), Intel will have released 3 generations of Core i CPUs, and graphics performance will have gone up by at least 4x. http://www.tomshardware.com/reviews/core-i7-4770k-haswell-performance,3461-6.html

So the takeaway is that this isn't business as usual. Integrated graphics is beginning to crowd out discrete graphics (granted, not the $150+ segment). However, spending less than $100 on a graphics card won't get you much more, if any, extra performance if you have an AMD A10 CPU.

Re:Now intel users can play 10 year old games :D (2)

alen (225700) | about a year ago | (#43609755)

yep

I love buying games the day of release with all the DRM, bugs, always-connected-to-the-internet issues where they can't support all the players, etc.

I'd rather buy a game a year or two after release, once it's on sale at least 50% off

Re:Now intel users can play 10 year old games :D (0)

Anonymous Coward | about a year ago | (#43609773)

Be sure to tell yourself that nonsense as you drop another $500 on a landfill-bound graphics card to play Xbox360 ports.

Re:Now intel users can play 10 year old games :D (1)

sexconker (1179573) | about a year ago | (#43611735)

Be sure to tell yourself that nonsense as you drop another $500 on a landfill-bound graphics card to play Xbox360 ports.

The other option being dropping $500 to $2000 on a landfill-bound laptop to play Xbox360 ports at worse settings and frame rates? Or dropping $1000 for the high-end, landfill-bound desktop CPU from Intel (since that's the one with the "high-end" integrated GPU) to do the same?

You haven't tried, have you? (2)

Sycraft-fu (314770) | about a year ago | (#43609879)

New Intel GPUs are surprisingly competent. No, they don't stand up to higher end discrete solutions but you can game on them no problem. You have to turn down the details and rez a bit in some of the more intense ones, but you can play pretty much any game on it. (http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html). For desktops I always recommend dropping $100ish to get a reasonable dedicated card but for laptops, gaming on an integrated chip is realistic if your expectations are likewise realistic.

Gone are the days of the GMA 950 that sucked at even simple GDI+ rendering. The integrated GPUs now are competent, though lower end (in keeping with their power profile).

Re:You haven't tried, have you? (1)

oic0 (1864384) | about a year ago | (#43609947)

I've got one of those hybrid laptops with both intel and Nvidia gpus in it. When the settings are screwed up and it forgets to switch to the Nvidia card, even simple games like league of legends run jerky and more modern stuff becomes completely unplayable unless you completely neuter the settings and resolution.

Re:You haven't tried, have you? (1)

Sycraft-fu (314770) | about a year ago | (#43610045)

In other words, precisely what I said. Yes, you have to back off on rez and settings. Guess what? That's fine, and expected for something as low power profile as an integrated GPU. Fact remains you can game on it just fine, even new games. Not everyone needs everything cranked, not everyone wants to spend that kind of money. Intel GPUs are extremely competent these days. They are low end and will always be because they get a fraction of the 30-40ish watts of power a laptop CPU/GPU combo can have rather than the 100+ watts a dedicated laptop or desktop GPU can have. However for all that, they do well.

Scaling back has a limit (1)

tepples (727027) | about a year ago | (#43611377)

Yes, you have to back off on rez and settings. Guess what? That's fine, and expected for something as low power profile as an integrated GPU. Fact remains you can game on it just fine, even new games

One could game just fine on an original PlayStation or a Nintendo DS, and new DS games were still coming out until perhaps a few months ago. It's just that the settings have to be scaled so far back that things look like paper models [wikipedia.org] of what they're supposed to be. The DS in particular has a limit of 6000 vertices (about 1500-2000 polygons) per scene unless a game enables the multipass mode that dramatically lowers frame rate and texture resolution. Fortunately, the HD 4000 in Ivy Bridge runs games with detail comparable to the PS3.

Re:You haven't tried, have you? (2, Informative)

Anonymous Coward | about a year ago | (#43610337)

I put together a tiny mini-ITX system with an Ivy Bridge i3-3225. The case is super tiny and does not have space for a dual slot video card. Even low-to-mid grade video cards are dual slot nowadays, and I didn't have any on hand that would fit.

I shrugged, and decided to give it a go with the integrated HD 4000. The motherboard had really good provision for using the integrated graphics anyway: dual HDMI+DVI and even WiDi support via a built-in Intel wireless N card. (WiDi only works via the integrated GPU, so that will be fun to play with if I ever get any WiDi displays.)

This is the first time I've ever been impressed with integrated graphics. Hell, this is the first time I've ever considered it usable. Even for non-game stuff, I've always been dissatisfied with integrated GPUs when things get intense. Dual monitors, windows, streaming HD video, and anything that uses GPU acceleration could bring the system to its knees. With the HD 4000 everything is butter smooth, even when I've got a 3D-ish indie game going (like Minecraft).

The system is tiny, whisper quiet (so quiet that the HD noise was aggravating and I switched to an SSD for noise alone), runs very cool, and boots in a blink from its SSD. I could get used to this.

Re:You haven't tried, have you? (1)

serviscope_minor (664417) | about a year ago | (#43610371)

No, they don't stand up to higher end discrete solutions

They don't stand up to AMD's integrated graphics either.

Re:You haven't tried, have you? (1)

beelsebob (529313) | about a year ago | (#43611105)

Actually, given the benchmarks, the GT3e should be about 20% faster than the A10 5800k's graphics chip.

Re:Now intel users can play 10 year old games :D (0)

Anonymous Coward | about a year ago | (#43610187)

Self-rationalizing bullshit for the dying breed of PC builders/framerate jockeys. Wave your e-penis elsewhere, bro.

Upgrade Thinkpad W500? (0)

Anonymous Coward | about a year ago | (#43609685)

No, I'll wait for the second generation of Haswell because 27 watts is still too hot for a mobile CPU. Maybe I'll get the chance to upgrade my Thinkpad W500 in 2016 with a Thinkpad W5??.

Intel Graphics are turds (-1)

Anonymous Coward | about a year ago | (#43609857)

Chips have always been turds. I'm not expecting much different here.

GPU with integrated 128-core CPU (2)

Impy the Impiuos Imp (442658) | about a year ago | (#43609911)

Wouldn't these kinds of things be more accurately described as GPUs with integrated CPUs?

It's been 10 years since Intel started panicking when they realized a Pentium core could be tucked into a tiny corner of a GPU, as far as transistor count went.

Amazing! (1, Insightful)

DarthVain (724186) | about a year ago | (#43610285)

Wow so rather than the 11 FPS you were getting, you might be able to get 22 FPS!

You can almost play a video game at those speeds! Well done!

Re:Amazing! (1)

beelsebob (529313) | about a year ago | (#43611141)

Actually, if you look at some benchmarks [anandtech.com] you'll see that the current (Ivy Bridge) chips play current games at 30-40fps (even Crysis on pretty high detail settings). So you're looking at 60-100fps with Haswell's GT3e.

Re:Amazing! (1)

DarthVain (724186) | about a year ago | (#43612161)

Not sure I would trust those benchmarks.

First: Metro is all on lowest settings and scores between 11-25FPS
Second: Crysis - I am not sure how "performance" is higher than "mainstream"... unless "performance" is just a nicer way to say lowest quality, highest performance.
Third: The resolutions they are talking about are 1366x768 which might have been relevant 10 years ago, but are hardly what I would call modern.

So if you are saying that these integrated solutions will barely play modern games if at all on their lowest settings, then yes OK I agree.

What I was making fun of is that saying something has 1.5-3x the power isn't so significant when it is multiplied by very little in the first place.

One could argue that the increase will allow some casual lower end gaming on integrated video to be improved. However one might also argue that newer modern games that come out will also *gasp* have higher demands on video anyway, and so your gains are moot. Unless you just want to play old games etc...

Then again if you are worried about video, you likely want to play games, and the new ones that come out.

Don't get me wrong, I think this is a good step, and one that will eventually see a level of integration that may allow for real integration of video for gaming purposes. Particularly now that new systems can have 64bit OS/CPU and more than 3.5GB of memory to possibly share with the video. The trouble is sharing all that bandwidth across a single architecture, which seems to be where all the current development is.

Re:Amazing! (1)

beelsebob (529313) | about a year ago | (#43612555)

First: Metro is all on lowest settings and scores between 11-25FPS

Okay, so one single game shows poorish performance, and will show entirely adequate performance on Haswell GT3e.

Second: Crysis - I am not sure how "performance" is higher than "mainstream"... unless "performance" is just a nicer way to say lowest quality, highest performance.

In other words, it manages 60fps already on lowest settings, and will do 120-150fps on Haswell, and manages 30-40fps on decent settings, and will manage 60-80 on Haswell.

Third: The resolutions they are talking about are 1366x768 which might have been relevant 10 years ago, but are hardly what I would call modern.

Actually, 1366x768 is the single most common resolution out there these days on new machines. This is more true on laptops, which these IGPs are aimed at, where it ships on 95% of machines. Sure, us geeks are likely to want shiny Retina MacBook Pros with 2880x1800 screens, but for the vast majority of people these chips will open up gaming at decent quality levels on integrated graphics.

Re:Amazing! (1)

DarthVain (724186) | about a year ago | (#43612981)

I suppose. I wasn't really thinking of laptops. But you are right, on laptops integrated video is more common, and a dedicated GPU is horribly expensive for even a middling card. Which is why I would not really consider gaming on a laptop.

So I suppose, as a target, it will make some difference for those that wish to play some games on their laptop. At any rate it will raise the bar as to which games are possible vs. impossible to play on a regular laptop that doesn't cost 2 grand.

Sigh... And on the general computing side... (2)

Gordo_1 (256312) | about a year ago | (#43610593)

Haswell parts are expected to be 10-15% faster than Ivy Bridge, which was itself barely any faster than Sandy Bridge.

Anyone remember the days when computing performance doubled or even tripled between generations?

I have a desktop PC with a Sandy Bridge i5-2500K running at a consistent 4.5GHz (on air). At this rate, it could be another couple of generations before Intel has anything worthwhile as an upgrade... I suspect that discrete-GPU-buying home PC enthusiasts are going to continue to be completely ignored going forward while Intel continues to focus on chips for tablets and ultrabooks.

Re:Sigh... And on the general computing side... (0)

Anonymous Coward | about a year ago | (#43610877)

Actually the IPC on Ivy Bridge is something like 30% faster than Sandy Bridge. It's just that the IVB parts don't overclock as well (incidentally, about 30% worse than their SB counterparts).

For normally clocked parts this is great. They're faster than SB, and consume less power. This makes them great in laptops and normal desktops. For enthusiasts, it's a wash. When you overclock either SB or IVB i5 and i7 K series CPUs their overall performance is about the same.

Re:Sigh... And on the general computing side... (1)

s7uar7 (746699) | about a year ago | (#43611193)

I use a 5 year old Q6600 for my everyday computing and it's only now starting to feel a little sluggish. I suspect that with more RAM and/or an SSD it could quite happily chug along for another 2 or 3 years.

Re:Sigh... And on the general computing side... (1)

MachineShedFred (621896) | about a year ago | (#43611281)

Intel has publicly stated many many times that they are going to a "tick - tock" development cycle, where they don't expect people to buy new every release. This is a "tick" which introduces new features and performance improvements, followed next year by the "tock" which will include a die shrink and even more power savings, with a slight bump in performance and mostly the same features.

This is why most Sandy Bridge main boards just needed a firmware update to support Ivy Bridge - they were mostly the same processor.

Re:Sigh... And on the general computing side... (1)

Gordo_1 (256312) | about a year ago | (#43611771)

Ok, but Haswell delivers virtually nothing computing performance-wise over a reasonably overclocked Sandy Bridge which is a full tick-tock cycle earlier.

Re:Sigh... And on the general computing side... (1)

Hadlock (143607) | about a year ago | (#43611833)

Modern CPUs are so fast nowadays that the bottleneck is feeding them, not processing the data. The bus between memory and the CPU allows it to process 24GB/s, but most computers only come with 2, 4, or sometimes 8GB. And then there's the issue of reading from the drive. The only way you're going to consistently peg an i5 for more than a few seconds (let alone an i7) is crunching terabytes of scientific data. Otherwise it's a software issue, like Kerbal Space Program which only uses 50% of one core instead of 80% of all cores.
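A crude way to put a number on the "feeding them" side is to time a big memcpy and divide bytes moved by seconds elapsed. The sketch below is only a rough probe (buffer size, caches, and turbo all skew the result, and the GB/s figure is entirely machine-dependent), but it makes the point that the memory bus, not the ALUs, is often the ceiling:

/* Crude memory-bandwidth probe: time repeated memcpy of buffers much
 * larger than the caches and report GB/s. The number is only a rough
 * lower bound on what the memory bus can deliver.
 * Build with something like: gcc -O2 probe.c (older glibc may need -lrt) */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    const size_t size = 256UL * 1024 * 1024;   /* 256 MiB per buffer */
    const int reps = 10;
    char *src = malloc(size);
    char *dst = malloc(size);
    if (!src || !dst)
        return 1;

    memset(src, 1, size);                      /* touch the pages up front */
    memset(dst, 0, size);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int r = 0; r < reps; r++)
        memcpy(dst, src, size);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gbytes = 2.0 * size * reps / 1e9;   /* each byte is read and written */
    printf("~%.1f GB/s effective copy bandwidth\n", gbytes / secs);

    free(src);
    free(dst);
    return 0;
}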


a bit late? (2)

slashmydots (2189826) | about a year ago | (#43611801)

Wow, that will bring it up to almost the same speed as the 2 year old AMD APUs if I'm not mistaken, lol. Talk about being behind! I was amazed when the first couple P-series graphics adapters came out built into the first Sandy Bridge chips. You could actually play Skyrim on an i3-2100 at low settings and it killed at HD video playback. Now for $110, my demo unit at my shop is an A10 APU with a 6.9 graphics rating. It can play Starcraft II at almost maxed settings at 60FPS at 1280x1024 and the CPU is just a hair slower than an i5-2400 for a crap-ton less money. At least $60 if I remember correctly. The 1866 native memory controller helps too since even Ivy Bridge only hits 1600. So then it's better at video encoding and gaming than any i5. Intel is really playing catch-up at this point. I don't know why everyone's all doom and gloom over AMD getting crushed by Intel. I would have thought that was over by now, especially after releasing Trinity, Zambezi, and Vishera. Those 3 really are better than Intel in almost every way.

..but what does that come out to? (1)

tech.kyle (2800087) | about a year ago | (#43611875)

Twice nothing is still nothing, although getting a 1.5x increase for low-wattage applications is nice to hear.

End result, I'm still going to get people complaining to me that The Sims runs slow. Only difference is it'll be stop-motion instead of slideshow.