
Core i5 and i3 CPUs With On-Chip GPUs Launched

timothy posted more than 4 years ago | from the two-links-is-plenty dept.


MojoKid writes "Intel has officially launched their new Core i5 and Core i3 lineup of Arrandale and Clarkdale processors today, for mobile and desktop platforms respectively. Like Intel's recent release of the Pine Trail platform for netbooks, the new Arrandale and Clarkdale processors combine an integrated memory controller (DDR3) and a GPU (graphics processor) on the same package as the main processor. Though it's not a monolithic device (it's built with multi-chip module packaging), the design does allow these primary functional blocks to coexist in a single chip footprint or socket. In addition, Intel beefed up their graphics core, and it appears that the new Intel GMA HD integrated graphics engine offers solid HD video performance and even a bit of light gaming capability."


upgrade treadmill (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30638770)

stop it, I want to get off!

Re:upgrade treadmill (1)

OrangeTide (124937) | more than 4 years ago | (#30639146)

don't worry, there is little doubt that this is a downgrade. (except for Atom owners)

Re:upgrade treadmill (1)

PopeRatzo (965947) | more than 4 years ago | (#30639894)

don't worry, there is little doubt that this is a downgrade. (except for Atom owners)

Care to elaborate on how an i5 on a laptop is a downgrade for anyone?

To GPU bandwidth? (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30638812)

Itching to see how good these chips are at some number crunching on the GPU portion. I've always had an issue with the limited bandwidth between system memory and GPU memory. That northbridge pisses me off.

I realise these particular chips are mobile processors.

Re:To GPU bandwidth? (0, Troll)

William Robinson (875390) | more than 4 years ago | (#30638948)

I realise these particular chips are mobile processors.

I didn't RTFA, but I've always wished I could use Intel chips for mobile/portable devices. Last time I checked, they were so power hungry as to be almost useless for battery-based applications (except maybe notebooks/laptops, where a huge battery can be afforded). Their Canmore chip consumed over 30W of power... settling down only for STBs and gaming stations.

Re:To GPU bandwidth? (1)

Narishma (822073) | more than 4 years ago | (#30639116)

By mobile the parent probably meant laptop.

Intel branding considered harmful (5, Insightful)

wisty (1335733) | more than 4 years ago | (#30638822)

Grrr ... I wish Intel would go back to their system of giving new names to new chips and then adding a MHz rating (and if that's not enough, maybe a cache size and number of cores) to distinguish them, rather than using a weird combination of new names (for their top-tier chips) and old names (for their low-end gear).

I only just realized that Pentium no longer means "crappy NetBurst", but now means "low-end C2D". And later this month, there will be "Pentiums" and even "Celerons" built on the same architecture as the i5. How do you let your friends know whether the "Pentium" is a worthless, power-hungry dinosaur or a cheap version of the i5? Should people memorize the chip serial numbers? Because that seems to be the only way of figuring out what a chip is these days.

Re:Intel branding considered harmful (5, Informative)

NoNickNameForMe (884862) | more than 4 years ago | (#30638880)

That is not the only problem nowadays; even processors within a given family may or may not have specific features (VT, for example) disabled. You'd think there was a conspiracy going on...

Re:Intel branding considered harmful (4, Informative)

cowbutt (21077) | more than 4 years ago | (#30639094)

Even worse than that, at least one model, the Core 2 Quad Q8300, both does and does not have VT, depending on the sSPEC code; SLB5W doesn't [intel.com], SLGUR does [intel.com]. Good luck trying to buy one of those online and being sure of what you're gonna get!

Re:Intel branding considered harmful (0)

Anonymous Coward | more than 4 years ago | (#30639134)

All the 5xxx Pentium Dual-Cores as well, AFAIK. It's turning into a really annoying crapshoot, especially given that a number of retailers have had CPU/mobo combos on sale with no way to determine which sSPEC they're actually stocking :(

Re:Intel branding considered harmful (0)

Anonymous Coward | more than 4 years ago | (#30639524)

As was the case with Core 2 Duo T5x00/7x00

Re:Intel branding considered harmful (3, Funny)

Anonymous Coward | more than 4 years ago | (#30639794)

It's just Schroedinger's processor. You have to watch it closely to permanently enable or disable VT.

Re:Intel branding considered harmful (1)

mvar (1386987) | more than 4 years ago | (#30638882)

Indeed, things back then were so much simpler. You had Pentium & Celeron; now you have Celeron, Celeron Dual-Core, C2D, C2Q, i5, i7, not to mention all those different cores... oh god

Re:Intel branding considered harmful (0)

Anonymous Coward | more than 4 years ago | (#30639614)

you missed i3, Atom and Xeon (in its various X and W guises) ....

Re:Intel branding considered harmful (1)

dingen (958134) | more than 4 years ago | (#30638906)

I fully agree with this; it's absolutely impossible to fully understand Intel's CPU product lineup. And why make all those different models anyway? I understand having a branch of products focusing on power consumption and another on speed, but the current number of different processors, brand names, code names, series and serial numbers is completely insane. Especially, as you point out, because the meaning of these names keeps changing all the time!

Re You're an idiot. (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30638972)

As a 49 year old feminist grandmother I think and consider you're a fool, most people don't need to be so coreful in the first place... and if you're such a cadhead that you actually might need more than 2 cores you can investigate what the model numbers correspond to. Quit whinging, there are bigger things to worry about in life than looking up some specs once a year.

Re:Intel branding considered harmful (4, Interesting)

hairyfeet (841228) | more than 4 years ago | (#30638988)

That is one of the reasons I ended up switching to AMD. With Intel it was getting to be a PITA to figure out which were the "good" chips, which were the "okay" chips and which were the cheapos. Especially since some of their chips have VT and some don't. I like how AMD has only three lines: Phenom (best), Athlon (good) and Sempron (cheapo). Plus I remember what it was like when Intel was a monopoly and do NOT want to go back!

And let's be honest, once we hit dual cores for the average Joe, the PC had passed "good enough" a few miles back. Checking the logs on my customers' PCs on follow-up, even the duals are spending a good amount of their time twiddling their thumbs, because the average user just doesn't come up with enough work to keep them fed. And with the economy in the crapper, my customers like how cheap the new AMDs are. Hell, you can get a quad for $99!

And as far as these new chips go, does Intel want to get a monopoly charge dropped on it? I mean, here they are, being investigated left and right, and they come out with a whole new line of chips with onboard GPUs, which looks like just another shot at locking out Nvidia. It sure as hell smells to me like trying to lock up the chipset market for themselves. I predict that if Intel doesn't get a serious smackdown from the EU or Justice Dept, it's gonna end up just them and AMD, unless Nvidia buys Via and tries to get in the game that way. Does ATI even make chipsets for Intel boards since being bought by AMD? I know they locked Nvidia into the dead-end LGA775, and Nvidia basically gave up. So is there anyone besides Intel making chipsets for the new socket?

Re:Intel branding considered harmful (1)

71bigblock (633491) | more than 4 years ago | (#30639330)

Nvidia has a license to make QPI-based chips. Whether they have any real plans to do so.....

Re:Intel branding considered harmful (1)

tyrione (134248) | more than 4 years ago | (#30639376)

Nvidia has a license to make QPI-based chips. Whether they have any real plans to do so.....

No, they do not. That was the reason they bailed out of the controller market.
Intel licensed SLI from Nvidia for their QPI-based Core processors.

SANTA CLARA, CA—AUGUST 10, 2009—NVIDIA Corporation today announced that Intel Corporation, and the world’s other leading motherboard manufacturers, including ASUS, EVGA, Gigabyte, and MSI, have all licensed NVIDIA® SLI® technology for inclusion on their Intel® P55 Express Chipset-based motherboards designed for the upcoming Intel® Core i7 and i5 processor in the LGA1156 socket.

http://www.electronista.com/articles/09/12/16/ftc.ignores.amd.settlement.in.intel.suit/ [electronista.com]

The FTC notes the lawsuit is not a direct antitrust case and only accuses Intel of violating competition and monopoly rules under Section 5 of the FTC Act. As a result, it prevents other companies from 'piggybacking' on the lawsuit by using an antitrust decision to demand triple damages in any private cases. In pursuing the complaint, the government commission is hoping to ban Intel from engaging in unfair bundling, pricing and exclusionary licenses and could, if victorious, force Intel to allow NVIDIA chipsets like the GeForce 9400M and Ion for Core i3, i5 and i7 processors as well as future Atom designs. Companies like Apple have faced the possibility of mandatory major reworkings of their computers to continue using modern processors.

Re:Intel branding considered harmful (1)

DrMrLordX (559371) | more than 4 years ago | (#30639336)

No, ATI/AMD does not make LGA1156 or LGA1366 motherboard chipsets. Nobody but Intel does, in fact.

Not that different (5, Informative)

Anonymous Coward | more than 4 years ago | (#30639436)

Intel also has three lines that more or less directly correspond to AMD's: Core/Phenom (good), Pentium/Athlon (ok) and Celeron/Sempron (cheap), plus the server Xeon/Opteron. The real pain is the number of different model numbers and numbering schemes. The secret decoder ring for Intel models (a rough parsing sketch follows the list):

A) Old three-number codes
E.g. Pentium 965, Celeron 450, ...
The first digit is the model; the second digit corresponds to the speed.
These are usually old crap and should be avoided. The Celeron 743 and Celeron 900 are fairly recent low-end chips that you can still buy.

B) Letter plus four-number codes, e.g. SU7300:
* S = small form factor
* U = ultra-low voltage (5-10W), L = low-voltage (17W), P = medium voltage (25W), T = desktop replacement (35W), E = desktop (65W), Q = quad-core (65-130W), X = extreme edition
* 7 = model line; tells you the amount of cache, VT capability etc. Scale goes from 1 (crap) to 9 (can't afford).
* 3 = clock frequency; relative performance within the line. Scale from 0 to 9.
* 00 = random features disabled or enabled; you have to look up the specific details.

C) New Core i3-XYZa codes
Similar to scheme B, but with an added dash and more confusing:
* i3 = line within the Core brand; can be i3 (cheap, but better than Celeron or Pentium), i5 (decent) or i7 (high-end)
* X = the actual model; tells you the amount of cache and number of cores, but only together with the processor line (i3-5xx is very different from i5-5xx)
* Y = corresponds to clock speed; higher is better
* Z = modifier, currently 0, 1 or 5 for specific features
* a = type of processor: X = extreme, M = mobile, QM = quad-core mobile, LM = low-voltage mobile, UM = ultra-low-voltage mobile
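
A rough Python sketch of decoding a scheme-B code using the tables above; purely illustrative, with the letter mappings paraphrased from this post rather than from any official Intel spec:

    # Decode a scheme-B Intel model code (e.g. "SU7300") per the tables above.
    VOLTAGE_CLASSES = {
        "U": "ultra-low voltage (5-10W)",
        "L": "low voltage (17W)",
        "P": "medium voltage (25W)",
        "T": "desktop replacement (35W)",
        "E": "desktop (65W)",
        "Q": "quad-core (65-130W)",
        "X": "extreme edition",
    }

    def decode_model(code):
        """Split a code like 'SU7300' into the fields described above."""
        code = code.upper()
        small = code.startswith("S")      # optional small-form-factor prefix
        if small:
            code = code[1:]
        voltage, digits = code[0], code[1:]
        return {
            "small_form_factor": small,
            "class": VOLTAGE_CLASSES.get(voltage, "unknown"),
            "model_line": int(digits[0]),      # 1 (crap) .. 9 (can't afford)
            "relative_speed": int(digits[1]),  # higher = faster within the line
            "suffix": digits[2:],              # feature modifier; look it up
        }

    print(decode_model("SU7300"))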

Re:Not that different (1)

Rudeboy777 (214749) | more than 4 years ago | (#30639882)

This secret decoder ring is exactly where Intel has got it all wrong. Who needs to charge them with antitrust violations when the marketing and product departments will run them into the ground anyway?

Re:Intel branding considered harmful (2, Interesting)

Kjella (173770) | more than 4 years ago | (#30639546)

And as far as these new chips go, does Intel want to get a monopoly charge dropped on it?

The writing has been on the wall for a while: it will all be integrated into one chip, at least on the low end. Oh sure, Intel might get slapped one way or the other, but by the time the dust settles it'll all be on a <30nm chip, and no court will manage to force them to create discrete chips again.

The other part is games, but the chips are running ahead of eyes, displays and developer time; if you look at the latest reviews, they only test at 2560x1600 with full AA/AF. I'm sure Fermi will be impressive, but 30" displays are a tiny niche and the rest don't need it.

nVidia is talking about supercomputers and GPGPU, but they're going the way of Cray and SGI, into some niche where they'll slowly wither away. AMD will hang in there because their CPU/GPU combos beat Intel on the GPU part.

Re:Intel branding considered harmful (1)

TheKidWho (705796) | more than 4 years ago | (#30639746)

Last I checked, 1080p displays were becoming the norm for PCs...

Re:Intel branding considered harmful (3, Informative)

Rockoon (1252108) | more than 4 years ago | (#30639892)

1080p is quite a bit less than the 2560x1600 the poster was talking about. In consumer terms, it's comparing 2 megapixels vs 4 megapixels.
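
The raw pixel counts behind that comparison, as a quick sketch:

    # Megapixels for the resolutions being compared.
    for name, (w, h) in {"1280x1024": (1280, 1024),
                         "1080p": (1920, 1080),
                         "2560x1600": (2560, 1600)}.items():
        print(name, round(w * h / 1e6, 1), "MP")
    # 1280x1024: 1.3 MP, 1080p: 2.1 MP, 2560x1600: 4.1 MP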

Also, last I checked, the largest PC gaming segment still runs at 1280x1024 (presumably on the commodity 5:4 aspect LCDs which stormed the market several years ago). Only 12% run at 1080p or higher resolution. (source [steampowered.com])

The 512MB NVIDIA 8800GT is probably still the best bang-for-your-buck card on the market given the resolutions people are gaming at. The 8800GT handles every game you can throw at it just fine at 1280x1024.

Re:Intel branding considered harmful (0)

Anonymous Coward | more than 4 years ago | (#30639902)

Don't be dense. Reducing component count by integrating functionality provides enormous improvements in cost, reliability, power efficiency etc. There is no good reason for external GPUs except the immaturity of the technology.

Re:Intel branding considered harmful (2, Funny)

Anonymous Coward | more than 4 years ago | (#30638950)

I think the CPU lineup goes like this:
8088, 8086, 80286, 80386, 80486, Pentium, Athlon, umm not sure if there is anything faster than that.

Sockets and mobos (1)

grimJester (890090) | more than 4 years ago | (#30638958)

The average consumer has little chance of realizing that an i7 may need a 1156 or a 1366 socket depending on the model number. Those should really have been named differently.

Re:Sockets and mobos (3, Informative)

beelsebob (529313) | more than 4 years ago | (#30639006)

The average consumer doesn't give a shit what socket their CPU is in either, so it's all okay.

Re:Sockets and mobos (1)

phillips321 (955784) | more than 4 years ago | (#30639078)

I bet they do give a shit when they try using a hammer to fit a 1366 pin into a 1156 socket!

Re:Sockets and mobos (1)

mooglez (795643) | more than 4 years ago | (#30639096)

I bet they do give a shit when they try using a hammer to fit a 1366 pin into a 1156 socket!

Average consumers don't try to build their own computer from parts.

Re:Sockets and mobos (2, Insightful)

Rockoon (1252108) | more than 4 years ago | (#30639140)

..and now we know why.

Re:Sockets and mobos (2, Insightful)

beelsebob (529313) | more than 4 years ago | (#30639248)

No, the why is because they're not interested.

You don't build your own car, why? Because you're not interested in building cars.

You don't build your own house, why? Because you're not interested in building houses.

They don't build their own computers, why? Because they're not interested in building computers.

Re:Sockets and mobos (2, Insightful)

DrMrLordX (559371) | more than 4 years ago | (#30639362)

That's a faulty comparison. Cars and houses take many well-trained hands to build, whereas a PC can be built by a single individual with little to no training in a few hours' time (or less). I don't change my oil, I don't paint my house, hell, I can't even fix the leaky faucet downstairs, but I can certainly build my own PC.

Re:Sockets and mobos (1)

Deosyne (92713) | more than 4 years ago | (#30639454)

Yeah, that PC can be built with little training if you hand that person the exact collection of parts that they will be assembling and then supervise them. Picking parts, on the other hand, is a whole other ball of wax. I'm a systems engineer whose job it is to keep apprised of new PC technologies and yet it took me quite a while to determine the final configuration for my latest build. There are a crapton of different options for most components, which is awesome for those of us who spend a lot of time learning about them as we can get exactly what we are looking for, but it is totally intimidating to the vast majority of people who can only tell you what kind of computer they have by the brand name of the case.

Re:Sockets and mobos (1)

jonbryce (703250) | more than 4 years ago | (#30639412)

They buy a computer that is described as having a Core i7, and would like to know whether or not that allows them to run XP Mode in Windows 7. They don't care which type of socket the motherboard comes with.

Re:Intel branding considered harmful (0)

Anonymous Coward | more than 4 years ago | (#30638974)

You could compare Pentium 4s to each other pretty reliably based on clock speed. Sure, the Northwoods were a bit faster than the Prescotts, and the Extreme Edition chips had a nice speed boost from the cache, but generally clock speed made 'em match up.

However, turbo boost and new architectures can give a 50% speed boost on tasks like x264 encoding when you're talking Core 2 vs i5. The frequent changes necessitate new naming schemes.

Re:Intel branding considered harmful (1)

wisty (1335733) | more than 4 years ago | (#30639160)

You could compare Pentium 4s to each other pretty reliably based on clock speed. Sure, the Northwoods were a bit faster than the Prescotts, and the Extreme Edition chips had a nice speed boost from the cache, but generally clock speed made 'em match up.

However, turbo boost and new architectures can give a 50% speed boost on tasks like x264 encoding when you're talking Core 2 vs i5. The frequent changes necessitate new naming schemes.

That was the problem, I think. They were rubbing Moore's law in people's faces. Now, the i7 can always be $999, the i5 can always be $250, and the Pentium can always be $100. People won't feel like idiots for splashing out on a marginally more powerful system, because their i7 will always have a superior sticker to the lowly Pentium.

Re:Intel branding considered harmful (1)

sznupi (719324) | more than 4 years ago | (#30639358)

What? With Pentium it was easy: there was a year-or-so-long break between using the brand for NetBurst and for the Core architecture. For around two years now, anything new under the Pentium brand gets you a nice, cheap C2D CPU... perfect in typical laptops. Yes, it's slightly slower, but together with Intel GFX and slow HDDs it doesn't matter.

Intel of course wasn't really promoting those CPUs, preferring that you overpay for a full C2D, but they weren't secretive about them either.

Video decoding under Linux (1)

sajjen (913089) | more than 4 years ago | (#30638832)

What's the state of video decoding support under Linux for these integrated GPUs? I've been looking for something to update my HTPC with...

Re:Video decoding under Linux (4, Informative)

0100010001010011 (652467) | more than 4 years ago | (#30638978)

Not sure about Intel. But Nvidia has VDPAU, which is very nice. Feature Set C even added MPEG-4 decoding and SD content upscaling, all on the GPU (http://en.wikipedia.org/wiki/VDPAU#NVIDIA_VDPAU_Feature_Sets)

Broadcom finally released Crystal HD drivers for Linux, which means if you have a mini PCI-E slot, you can get HD content. (http://xbmc.org/davilla/2009/12/29/broadcom-crystal-hd-its-magic/)

If you want to know what is available for which GPU/platform, keep an eye on what the XBMC guys are doing. They seem to be at the forefront of getting hardware acceleration working on different setups:
http://xbmc.org/wiki/?title=Hardware_Accelerated_Video_Decoding [xbmc.org]
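
If you want to check whether VDPAU decode acceleration actually works on a given box, a rough sketch (it assumes the stock vdpauinfo tool is installed, and the exact output wording may vary by version):

    # Probe for VDPAU decoder support by running vdpauinfo.
    import shutil
    import subprocess

    def vdpau_available():
        if shutil.which("vdpauinfo") is None:
            return False  # tool not installed
        try:
            out = subprocess.run(["vdpauinfo"], capture_output=True,
                                 text=True, timeout=10)
        except subprocess.SubprocessError:
            return False
        # vdpauinfo lists supported decoder profiles when acceleration works
        return out.returncode == 0 and "Decoder capabilities" in out.stdout

    print("VDPAU decode acceleration:", vdpau_available())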

Re:Video decoding under Linux (1)

Kjella (173770) | more than 4 years ago | (#30639184)

Not sure about Intel.

I don't know about these chips, but the current Intel chipsets with HD acceleration support, like Poulsbo and the GMA X4500HD, have had extremely poor Linux support. nVidia with VDPAU was really the first viable solution.

Re:Video decoding under Linux (4, Interesting)

daoshi (913930) | more than 4 years ago | (#30638998)

I just got an HTPC for myself this Xmas. It's an Intel Atom N330 dual core + Nvidia ION. You can either build it yourself or buy a system from one of the vendors. If you build it yourself, it's cheaper and you can get a much bigger hard drive (1TB); the pre-built systems these days usually ship with a 320GB HD, but they usually have a better form factor. Mine gets perfect, smooth 1080p playback. I use XBMC (xbmc.org) on Ubuntu 9.10. You just need to install the latest Nvidia driver: https://launchpad.net/~nvidia-vdpau/+archive/ppa [launchpad.net] Get yourself an MCE remote.

Re:Video decoding under Linux (1)

sajjen (913089) | more than 4 years ago | (#30639458)

I'm currently running an HTPC based on an AMD Athlon 64 X2 4850e on a 780G-based motherboard. All media is stored on a NAS. This setup works great for 720p, but it's not as silent as I'd like. I've seen a few postings about XBMC on the BeagleBoard, but it doesn't seem to be in a functional state as of yet. What I'd really like is a passively cooled box that's able to play 1080p H.264.

Re:Video decoding under Linux (1)

MemoryDragon (544441) | more than 4 years ago | (#30639496)

Passively cooled? Forget it, unless you can find a Tegra-based system (which don't exist out of the box). Tegra can do it, but no one ships boards for it, because it is ARM-based, and you know how easy it is to get a decent ARM board (outside of the BeagleBoard there are none you can get). The ION is the closest you can get; add quiet active cooling and you are off, even with the cheapass Atom processors.
You might be better off in half a year's time: NVidia is working on a Via-based ION solution, which would be better than the Atom core.

Re:Video decoding under Linux (1)

asc99c (938635) | more than 4 years ago | (#30639748)

Passive cooling hopefully isn't necessary. Consider just getting a big HSF with a big fan, to run at minimum speed. I've got a Core 2 E6400 HTPC in a Silverstone LC11-M case. I could unplug every single case fan, set the stock cooler to the lowest possible speed (about 920rpm) and play videos without it overheating, or even getting close to overheating. At that point the noisiest item was the hard disc, even though it's got soundproofing panels around it.

I've recently bought a 30GB SSD to replace the noisiest bit, along with a Nexus LOW-7000 cooler. I've now got the single 120mm fan on the Nexus running at about 750rpm as the only moving part. I know it isn't technically going to be silent, but it is now silent to my ears at least. I think big fans are the way forward rather than passive cooling. I've also got an older 80mm Papst fan in my desktop PC, which at minimum speed I just can't hear, even with my ear an inch from the fan. Have a look for the Papst 8412NGLE; of the ten or so 'silent' fans I've bought over the years, this seems to be the one that remarkably does do what it says.

Re:Video decoding under Linux (1)

threephaseboy (215589) | more than 4 years ago | (#30639762)

What I'd really like is to have a passively cooled box that's able to play 1080p H.264.

You mean like this? [newegg.com]

Or anything under Linux? (1)

MacroRodent (1478749) | more than 4 years ago | (#30639068)

Is the graphics unit a derivative of the notorious Poulsbo (no good open-source Linux support), or of GMA9xx (open drivers on Linux)?

As a 49 year old feminist grandmother (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30638910)

It doesn't surprise me that this technology wasn't invented in that racist slave faleva dubai.

Netbook (1)

jlebrech (810586) | more than 4 years ago | (#30638914)

So it's a netbook CPU/GPU combo? On a desktop, isn't that a waste of transistors? Because who will use this GMA POS?

Re:Netbook (1)

DarkofPeace (1672314) | more than 4 years ago | (#30638930)

businesses.

Re:Netbook (2, Interesting)

beelsebob (529313) | more than 4 years ago | (#30639016)

No, it's a laptop CPU/GPU combo, these things are aimed squarely at high end laptops like MacBook Pros.

Re:Netbook (0)

Anonymous Coward | more than 4 years ago | (#30639444)

these things are aimed squarely at high end laptops like MacBook Pros.

High end? "offers solid HD video performance" is hardly a boast you'd make about high end hardware, even for a laptop. That's basic functionality. This only makes sense if you're comparing it with netbook performance.

Re:Netbook (1)

beelsebob (529313) | more than 4 years ago | (#30639548)

No, it makes sense to advertise it when it's a single chip that does everything. Netbook CPUs don't offer *any* video decoding support (other than software); that's all done on the chipset (assuming you actually got an ION).

What does "light gaming capability" mean? (0)

Anonymous Coward | more than 4 years ago | (#30638924)

How does it stack up compared to nVidia's chips? I would consider "light gaming capability" to imply pre-GeForce performance; is that what the editors intended to convey?

Re:What does "light gaming capability" mean? (2, Informative)

hairyfeet (841228) | more than 4 years ago | (#30639040)

Well, if you read the specs here [hothardware.com] you will see that it has 12 execution units, which I'm guessing is Intel-speak for stream processors. Considering a $30 [newegg.com] ATI card has 320 of them, I'm guessing that like all Intel GPUs it's gonna be of the uber-suck.

About the only ones I saddle with piss-poor Intel GPUs anymore are the housewives, who at most are playing a browser game on Facebook. Everyone else gets an Nvidia or ATI onboard so if they decide to do a little light* gaming they can.

* The new ATI onboard GPUs are surprisingly good at gaming. I personally was playing Bioshock and SWAT 4 on my 780V until I could find time to order a discrete 4650. While these games aren't cutting edge, the fact that an onboard GPU could actually game blew my fricking mind! Compared to the horrible chips that Intel calls GPUs it was actually nice, and it had full hardware acceleration for the most popular formats out of the box. I was impressed, and unlike so many horror stories I had heard, the ATI drivers were just as solid and stable as could be.

Solid huh? (4, Insightful)

John Betonschaar (178617) | more than 4 years ago | (#30638966)

In addition, Intel beefed up their graphics core and it appears that the new Intel GMA HD integrated graphics engine offers solid HD video performance

Solid HD video performance? I see 35% CPU load in the Casino Royale 1080p trailer screenshot, on a fast quad-core CPU. My puny single-core 1.6GHz Atom with NVidia graphics does 6-10% max on any 1080p content I throw at it in XBMC.

It's better than what Intel offered before (nothing), but I still wouldn't recommend Intel graphics for any HD video player.

Unrealistic expectations (-1)

anti-NAT (709310) | more than 4 years ago | (#30639000)

So when you're watching your HD video on your Atom 1.6Ghz with NVidia Graphics, what do you use the other free 90% of the CPU for?

How much more do you have to pay to have a freed up 90% CPU that you're probably not using for anything else at the time?

It seems to me you're in a race to the bottom - how much can you spend to use less of what you've paid for. IOW, your goal seems to be to get less bangs per buck, not more.

Multitasking (1)

aclarke (307017) | more than 4 years ago | (#30639108)

It's conceivable that one might want to be, say, ripping or transcoding one movie while watching another. Or running a web server while watching a movie. Maybe you want to watch a movie while you're compiling some code, so you want extra CPU for that. There are any number of things one might want to use one's CPU for while watching a movie.

Re:Multitasking (2)

chrisG23 (812077) | more than 4 years ago | (#30639218)

Yo dawg, I heard you like watching movies, so we made a computer powerful enough for you to watch a movie in hi-def, while you watch a movie in hi-def.

Sure -- theoretically (3, Insightful)

anti-NAT (709310) | more than 4 years ago | (#30639294)

"In theory, practice and theory are the same. In practice they aren't"

Most people don't multitask on their desktop, or better put, don't "significantly multitask", meaning run multiple programs at once that are intensively using the CPU(s). Typically, they're running one application which they're focused on, and the other background applications, while they are running, are mostly idle, utilising no more than the occasional few percent.

Ripping a movie on an Atom CPU PC (likely a netbook) at the same time as watching one? I think that's an unlikely event.

Running a highly trafficked web server on an Atom CPU? I think that's even less likely than ripping a movie while watching one.

Remember the OP's criticism? That 35% CPU utilisation, which of course still leaves 65% of the CPU for any other tasks (such as ripping a movie, running a web server etc.), was unacceptable. So how much unused CPU is enough for more-than-likely-theoretical, rather than in-practice, use? 70%, 80%, 90%? Any free CPU is CPU you've paid for but aren't getting any value from. The greater the unutilised CPU percentage, the less value for money you're getting.

People buy CPU capacity based on their peak usage, not their average usage. My fundamental point, and why I agree with "solid HD" performance, is that the typical high-load use of a PC while watching a movie is only watching that movie. If these new Intel CPUs with GPUs still have 65% capacity left while a movie is playing, you could say they're significantly overspec'd for their likely peak use, by 65% or so.

Re:Sure -- theoretically (3, Insightful)

NervousNerd (1190935) | more than 4 years ago | (#30639328)

I wouldn't quite say that. Lower CPU utilization equates to less electricity used, which equates to a lower power bill. I would much rather a movie use only 15% of my CPU's resources than 85% of them.

Re:Sure -- theoretically (0)

Anonymous Coward | more than 4 years ago | (#30639550)

your video decoder runs on air?

Re:Sure -- theoretically (0)

Anonymous Coward | more than 4 years ago | (#30639768)

Because everybody knows GPUs don't consume power...

Re:Sure -- theoretically (2, Interesting)

jonbryce (703250) | more than 4 years ago | (#30639428)

They might be watching a video while touching up their photos in Photoshop. That's probably the most likely heavy use scenario.

Re:Sure -- theoretically (1)

e70838 (976799) | more than 4 years ago | (#30639722)

I have two screens attached to my PC. This seems standard nowadays. A usual configuration is TV (IP, DVD, BD, ...) on one screen and work (Eclipse/Java, VirtualBox running Red Hat and Oracle, ...) on the other. Sometimes in the background I compress some videos to DivX.

What kind of multitasking would you accept as significant?

Re:Multitasking (2, Insightful)

sznupi (719324) | more than 4 years ago | (#30639404)

65% of a Core i5 CPU is worth much more than 90% of an Atom for "multitasking". Plus, those numbers aren't strongly correlated with how smooth any hypothetical multitasking will be; that's more about the OS & the way the apps are written.

Fewer bangs per buck... (0)

Anonymous Coward | more than 4 years ago | (#30639224)

Yeah, I know. Language is a fluid thing and we can all spell stuff how we want and use any word we choose regardless (the last refuge of the illiterate).

Go ahead and be dim.

Re:Fewer bangs per buck... (0)

Anonymous Coward | more than 4 years ago | (#30639298)

we can all spell stuff how we want and use any word we choose irregardless

FTFY

Re:Unrealistic expectations (1)

emj (15659) | more than 4 years ago | (#30639424)

Even if you only use 80% of your CPU I'm pretty sure it will choke at some point, because there is always something running.

Re:Unrealistic expectations (1)

John Betonschaar (178617) | more than 4 years ago | (#30639692)

Well, for one, the machine can be passively cooled, and it will jump over 70 degrees Celsius if I tax the CPU for more than a few percent; during GPU-accelerated playback it stays nicely around 60C. Also, the thing is in use as a home server/personal web server, which means there's all kinds of stuff running in the background. 35% of a Core i5 = around 300% of a single-core Atom; you do the math.

Last but not least, I like the idea that the most efficient part of my computer is used for the most appropriate task. The Atom is barely able to do full-screen standard-def Flash video, while the Nvidia GPU does silky-smooth 1080p content. How on earth would someone _not_ want that?
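
That math, sketched out (the 4x per-core factor is an assumption for illustration, not a measured number):

    # Reproduce the "35% of a Core i5 ~ 300% of a single-core Atom" claim.
    I5_CORES = 2            # Clarkdale/Arrandale Core i5s are dual core
    PER_CORE_SPEEDUP = 4    # assumed i5-core vs Atom-core throughput ratio
    atom_equivalent = 0.35 * I5_CORES * PER_CORE_SPEEDUP
    print("about {:.0%} of a single Atom core".format(atom_equivalent))  # ~280%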

Re:Solid huh? (2, Insightful)

hedwards (940851) | more than 4 years ago | (#30639098)

You shouldn't be recommending Intel graphics for pretty much anything. Unless I missed the memo, Intel is really the worst choice for graphics; sure, it's available on whatever platform you like, but AMD has been releasing documentation on its cards, and the Intel graphics chips just haven't been good. You're really far better off going with either nVidia or AMD for graphics chips at this point.

Re:Solid huh? (2, Informative)

DoofusOfDeath (636671) | more than 4 years ago | (#30639592)

You shouldn't be recommending Intel graphics for pretty much anything.

I disagree. I've had a few laptops that were primarily used for programming. On those, the basic, built-in Intel graphics (GMA950 and X3100, IIRC) were just fine.

In fact, they were even better than ATI or nVidia graphics for me: those computers were running Linux, and I could always count on the Intel drivers being available for the most up-to-date Linux kernels, whereas I couldn't make that assumption for the closed-source nVidia or ATI drivers.

Re:Solid huh? (0)

Anonymous Coward | more than 4 years ago | (#30639778)

There was one, and only one, card that was worse, and it's a distant memory: the S3 Virge [wikipedia.org]

Re:Solid huh? (1)

MadKeithV (102058) | more than 4 years ago | (#30639130)

Solid, as in "frozen solid".

Re:Solid huh? (1)

MemoryDragon (544441) | more than 4 years ago | (#30639456)

Actually, what you see in the first case is the entire CPU load including the GPU part; what you see in the second case is the pure CPU load with the GPU work offloaded. The end result is pretty much the same if you sum both up...

Re:Solid huh? (1)

John Betonschaar (178617) | more than 4 years ago | (#30639664)

You're wrong. What you're seeing is that apparently the Casino Royale video is not fully accelerated by the GPU. GPU load is never factored into the Windows performance monitor; you need GPU-Z or something like that.

Reviews online at anandtech.com and techreport.com (4, Informative)

IYagami (136831) | more than 4 years ago | (#30638968)

DESKTOP PROCESSORS
http://techreport.com/articles.x/18216/1 [techreport.com]
"As a CPU technology, Clarkdale is excellent. I can't get over how the Core i5-661 kept nearly matching the Core 2 Quad Q9400 in things like video encoding and rendering with just two cores. We've known for a while how potent the Nehalem microarchitecture can be, but seeing a dual-core processor take on a quad-core from the immediately preceding generation is, as I said, pretty mind-blowing. Clarkdale's power consumption is admirably low at peak
(...)
The integrated graphics processor on Clarkdale has, to some extent, managed to exceed my rather low expectations."

http://anandtech.com/cpuchipsets/showdoc.aspx?i=3704 [anandtech.com]
"For a HTPC there's simply none better than these new Clarkies. The on-package GPU keeps power consumption nice and low, enabling some pretty cool mini-ITX designs that we'll see this year. Then there's the feature holy-grail: Dolby TrueHD and DTS HD-MA bitstreaming over HDMI. If you're serious about building an HTPC in 2010, you'll want one of Intel's new Core i3s or i5s."

NOTEBOOK PROCESSORS
http://anandtech.com/mobile/showdoc.aspx?i=3705 [anandtech.com]
"From the balanced notebook perspective, Arrandale is awesome. Battery life doesn't improve, but performance goes up tremendously. The end result is better performance for hopefully the same power consumption. If you're stuck with an aging laptop it's worth the wait. If you can wait even longer we expect to see a second rev of Arrandale silicon towards the middle of the year with better power characteristics. Let's look at some other mobile markets, though.
(...)
If what you're after is raw, unadulterated performance, there are still faster options.
(...)
We are also missing something to replace the ultra-long battery life offered by the Core 2 Ultra Low Voltage (CULV) parts. "

Do Not Want! (2, Interesting)

A12m0v (1315511) | more than 4 years ago | (#30638976)

Anyone else suspicious of this? Intel trying to use its CPU monopoly to gain a GPU monopoly?

Re:Do Not Want! (1)

Anonymous Coward | more than 4 years ago | (#30638992)

Too late; Intel already owns most of the integrated GPU market. It is by far the most common vendor of GPUs in average machines, just not in gamer machines...

Re:Do Not Want! (1)

kramulous (977841) | more than 4 years ago | (#30639012)

We breathe oxygen. What do you breathe there?

Re:Do Not Want! (1)

petermgreen (876956) | more than 4 years ago | (#30639256)

It will further cement their already near-monopoly in the integrated-graphics-for-Intel-systems segment. Whether it will have much impact on the gamer graphics segment depends on how well it performs. It seems they have more or less caught up with AMD integrated graphics, but I don't think that in itself is enough to seriously impact sales of discrete graphics cards.

Unfortunately TFA jumps straight from integrated graphics to a £130 card and uses completely different settings for the two tests. What I'd really like to see is a comparison of the integrated graphics on these things with, say, an 8400 GS (a £25 card).

Anyone here got an 8400GS and one of the games used in TFA and prepared to run some benchmarks at the settings they used for the integrated graphics test? (Yeah, I know the rest of the system won't match, but all I'm interested in are ballpark figures.)

Re:Do Not Want! (0)

Anonymous Coward | more than 4 years ago | (#30639830)

Not an official benchmark, but my old Pentium 4 (3.0GHz) with a gig of RAM and an AGP GeForce 7600 runs Enemy Territory: Quake Wars at around 30-45 fps. Looking up pricing on this card was rendered irrelevant when I realised that it's so old that it's getting more expensive (most AGP cards are, though). Judging from gaming performance alone I'd put this card just below a GeForce 6400, which is also so old that it isn't being sold anymore. I'm sure the 8400 would blow this thing straight out of the water.

So congratulations, Intel: you have produced a cutting-edge graphics chip that gets half the performance of a budget gamer card released about 4 years ago, and gets absolutely dominated by your competitor's cheapest low-end crap.

This is only an article because Intel finally integrated a GPU and CPU on the same chip.

Interesting implications (3, Interesting)

rpp3po (641313) | more than 4 years ago | (#30639310)

While you might have missed that Intel has been the largest GPU vendor in the world for years (gaming is small compared to B2B sales), you are right anyway. When offering Intel CPUs implies having to buy their GPU, the air will become thin for excellent integrated chipset offerings such as Nvidia's. Instead of pushing customers through secret, anti-competitive contracts, they have just changed their product lineup. Want a CPU? Fine, but you can't have it without a GPU.

It will be interesting to see whether Apple will get special treatment. They have already semi-officially let word slip out that they are not interested in the Arrandale GPU and won't use it. It's just not powerful enough for their GPU-laden OS and application lineup compared to Nvidia's chipset offerings.

Re:Interesting implications (1)

TheRaven64 (641858) | more than 4 years ago | (#30639810)

And remember the last time they did this? Before the 486, there were a few x87 manufacturers (including AMD). The 486 came with an integrated 487, so there was no need to buy one from a third party (they later split the line into 486SX and DX, where the SX was a 486 with the broken 487 disabled).

AMD survived by ramping up investment in their x86 clones and shifted to selling x86+x87 cores, rather than just x87 cores, as their primary market. ATi is doing the same thing by being purchased by AMD. nVidia is trying to do something similar with ARM cores in the Tegra line.

Re:Do Not Want! (3, Interesting)

MemoryDragon (544441) | more than 4 years ago | (#30639480)

Yep, they already said they want to bankrupt Nvidia, and every move in the last year has been in this direction: first shutting out the ION chipset through illegal pricing, then pushing the GPU onto the CPU package so that the cheap-enough solution kills off wherever Nvidia got its core money from (and ATI, but they are less bothered, since they can do the same), and third, fighting a patent war to shoot them out of the chipset market.

The entire thing started when Nvidia was blabbering about how you don't need CPU upgrades anymore, just use the GPU for everything. That woke Intel up, and as usual, the cheapass solutions which are worse but cheaper kill off the competition. Worked in the past, works again.
I wonder if we will see Nvidia in the PC market at all in 5 years; they might end up being a second PowerVR, still healthy in the embedded sector but not at all present on the PC side of things.

What the hell... (4, Insightful)

NervousNerd (1190935) | more than 4 years ago | (#30639044)

What the hell is up with their model numbers? Quick: is that i5 you have a dual core or a quad core?! At least Intel's older Core 2 processors differentiated with "Duo" or "Quad", and AMD simply uses "X2", "X3" or "X4".

Re:What the hell... (0)

Anonymous Coward | more than 4 years ago | (#30639354)

If there is any sanity in Intel, an i5 is obviously a penta-core ... oh, no, wait ...

Re:What the hell... (2, Interesting)

Tim C (15259) | more than 4 years ago | (#30639880)

I bought an i7 as part of a general upgrade a few months ago; it wasn't until I had it installed and happened to check Task Manager that I realised it was a quad core chip.

Vista Ready ? (0)

emilper (826945) | more than 4 years ago | (#30639264)

well ... is it "Vista Ready"?

Re:Vista Ready ? (0)

DrMrLordX (559371) | more than 4 years ago | (#30639382)

More importantly, will it run Crysis?

(trick question; nothing runs Crysis, at least not smoothly at hi res with all the settings maxed out)

Re:Vista Ready ? (1)

emilper (826945) | more than 4 years ago | (#30639468)

My question was a trick, too: Intel sold quite a few onboard graphics chips as "Vista Ready" in the past; I bought one without doing my homework first, and now I'm quite cautious when it comes to Intel hype.

So here's the summary: (0)

Anonymous Coward | more than 4 years ago | (#30639322)

The new Core i5 Clarkdale-based CPUs might be interesting, but they're overpriced. These are dual-core CPUs with Intel integrated graphics built into the CPU package, and they cost roughly as much as a quad-core Lynnfield (Core i5-750) or AMD Phenom II X4 965 CPU, both of which will trounce it in any benchmark. They are somewhat based on the Nehalem architecture, but they moved the memory controller off of the CPU core (while leaving it on the CPU package), introducing more latency and lower memory bandwidth.

The Core i3 CPUs offer more of a value proposition with prices starting under $130. These might be the chips to go for if you want an HTPC, though the CPU utilization for HD media decodes is much higher than on similar platforms (i.e., nVidia ION).

The integrated graphics performance is nothing to get excited about and is really only suitable for business use/HTPC use. You're still not going to game on this GPU, nor will it be suitable for high performance computing.

The single most interesting thing about these CPUs is the inclusion of the AES-NI instructions, which accelerate AES encrypt/decrypt functions. When paired with full-disk encryption solutions that use AES, these CPUs see a roughly 15% decrease in disk performance as opposed to the usual 30% or so. Of course, you might just as well buy a quad-core CPU and let the extra cores handle encode/decode too.
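
On Linux you can check whether a CPU exposes AES-NI by looking for the "aes" flag; a minimal sketch:

    # Look for the "aes" flag in /proc/cpuinfo (x86 Linux).
    def has_aes_ni(path="/proc/cpuinfo"):
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "aes" in line.split(":", 1)[1].split()
        return False

    print("AES-NI supported:", has_aes_ni())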

Realistically these are going to be used in business-class PCs. You get decent dual core performance, competent business graphics performance, and integrated support for accelerated AES functions. They might also be suitable for home brew VPN endpoint solutions with their AES acceleration and relatively low power requirements as well.

Oh yeah...while these will work in existing Socket LGA 1156 boards (with a BIOS update, of course) you will need a completely new motherboard if you want to take advantage of the integrated graphics capability, as existing boards do not have connectivity from the CPU socket to a video out port. Of course, if you have an LGA 1156 mainboard already then any of these new Clarkdale CPUs would be a downgrade, so probably no worries there.

Article is terrible (4, Informative)

sammydee (930754) | more than 4 years ago | (#30639448)

The article is awful. There is only one game benchmark, and that's against an integrated AMD GPU that hardly anybody has heard of. There is also no way to tell from the article whether the integrated Intel graphics actually has HD video decode acceleration or not. The modern Core i5 chips are quite capable of decoding 1080p content by themselves without any GPU assistance.

I think the article writer misunderstands how hardware video decode assist actually works. It isn't magically engaged when you play any HD movie in any media player (usually it has to be turned on via an option somewhere, in a media player app that supports it), and it isn't a sliding scale of CPU usage. Modern decoding chips either decode EVERYTHING on the card, reducing CPU usage to 1% or 2%, or the app decodes EVERYTHING in software, resulting in fairly high CPU usage.

I still have no idea if the new Intel graphics chip actually offers any HD video acceleration at all. If it does, that would make it a nice choice for low-power and HTPC solutions. If it doesn't, it's just another crappy integrated graphics chip.

solid HD performance? (1)

StripedCow (776465) | more than 4 years ago | (#30639508)

With 20.7 frames per second?
That's not what I call solid performance...

tough day for nvidia stock (1)

tjstork (137384) | more than 4 years ago | (#30639564)

A WSJ analyst has to be looking at this and concluding that the GPU business is doomed.

Re:tough day for nvidia stock (1)

TheRaven64 (641858) | more than 4 years ago | (#30639870)

It is, and has been for a long time. It's now in the same place that the discrete math coprocessor market was in 1989. That's not necessarily a problem for nVidia for two reasons.

First, Intel is licensing the Atom microarchitecture to SoC manufacturers. They can also license a GPU core design from nVidia, and maybe a DSP design from someone else, and build their own integrated SoC with an nVidia GPU and an Atom CPU. This is Intel's attempt to compete with ARM. When you buy an ARM chip, you almost always buy a SoC that contains a few other cores to suit your particular application. The diversity in the ARM marketplace is something Intel has difficulty competing with.

Secondly, nVidia doesn't just make GPUs for x86. They also make their own line of ARM SoCs: Tegra. These take a fairly stock ARM core and combine it with a low-power nVidia GPU core. I'm not sure whether they will keep developing Tegra, or whether they will just license the IP to other SoC manufacturers. The margins are higher with the first option, but the quantities are bigger with the second.

If I were in charge of nVidia, I would look seriously at following ARM's business model and transitioning to selling IP cores to SoC makers. Even Intel buys some of these - some of their chipsets incorporate a rebranded PowerVR GPU - and they could probably sell some cores to AMD and Intel, as well as the likes of Samsung, TI, and so on who produce ARM SoCs.

Just the mere mention (1)

Mattskimo (1452429) | more than 4 years ago | (#30639740)

of integrated graphics makes me shudder. Didn't we get over that in about 1999? Seriously though, this looks like a fairly terrible solution unless you feel like running Vista on something the size of an iPod.

OK can someone clear this up (1)

ZERO1ZERO (948669) | more than 4 years ago | (#30639760)

I have no idea what Intel are calling their chips or which is the best, etc.

Can someone answer these 'simple' questions? In terms of regular geek activities (movie playing/encoding, gaming, compiling, rendering, desktop use, all the regular things):

1. Which processor is the all-out fastest/best (money no object)?

2. Which processor is the best bang for the buck (money an object)?

3. How do Intel chips compare to AMD at the bang-per-buck level?
