
Intel Unveils 10-Watt Haswell Chip

Soulskill posted about a year and a half ago | from the ever-more-efficient dept.


adeelarshad82 writes "At IDF, Intel announced the company's fourth-generation Core processor, code-named Haswell. The chip is based on the same 22nm process used in the current third-generation Core products. What makes this chip remarkably different from the third-generation chips is its ability to deliver twice the graphics performance at much lower power consumption, which Intel has achieved by making use of a number of tactics." HotHardware has video of Haswell running a 3D benchmark.

103 comments

First Comment (-1)

Anonymous Coward | about a year and a half ago | (#41306939)

YEAH YEAH YEAH!

Re:First Comment (-1)

Anonymous Coward | about a year and a half ago | (#41307351)

the sept 11 of comments

Compared to ARM (-1)

Anonymous Coward | about a year and a half ago | (#41307501)

10W is fine, but compared to 2-3W ARM chips, it's still just a PC processor...

2-3W may not be the best comparison point for performance per watt, but the article doesn't even mention ARM, so the comparison is out of context.

Re:Compared to ARM (5, Interesting)

atlasdropperofworlds (888683) | about a year and a half ago | (#41307575)

When you consider that the x86 uses 3x the power, but can run a benchmark such as multithreaded linpack 1000x faster, it suddenly seems like we're getting ripped off by these ARM processors.

In reality, this processor consumes 20x less power (I assume that means 1/20th) than current Ivy Bridge chips. I presume that's under normal use. It's a huge win for laptops.
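The parent's arithmetic can be sanity-checked with a quick sketch. All numbers below are the commenter's own claims (3x the power, 1000x the LINPACK throughput), not measured values:

```python
# Hypothetical perf-per-watt comparison using the commenter's own figures:
# an x86 chip drawing ~3x the power but running LINPACK ~1000x faster.
arm_power_w = 3.0          # claimed ARM power draw
x86_power_w = 3.0 * 3      # "3x the power"
arm_score = 1.0            # normalized LINPACK throughput
x86_score = 1000.0         # "1000x faster"

perf_per_watt_arm = arm_score / arm_power_w
perf_per_watt_x86 = x86_score / x86_power_w

# Ratio of performance-per-watt, x86 vs. ARM (~333x under these assumptions)
print(perf_per_watt_x86 / perf_per_watt_arm)
```

Under those assumed inputs, the x86 part comes out roughly 333x ahead on performance per watt, which is where the "we're getting ripped off" quip comes from.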

Re:Compared to ARM (0)

Anonymous Coward | about a year and a half ago | (#41307897)

Intel is not doing any better; from what I remember, Medfield is pretty much on par with ARM chips. The relationship between power consumption and performance is not linear. If you could scale linearly, I'm sure Intel would release a chip that has the power envelope of an ARM chip but is 333.3x faster.

Re:Compared to ARM (5, Informative)

Wallslide (544078) | about a year and a half ago | (#41307975)

According to anandtech.com, the '20x lower power' statistic is only a reference to the chip's idle power state, not while it's under any sort of processing load.

Re:Compared to ARM (1)

atlasdropperofworlds (888683) | about a year and a half ago | (#41320875)

So most of the time, the processor is idle. The rest of the time, it's doing processing 1000x faster than an ARM CPU. Given that current mobile CPUs use somewhere around 60-70W under full load, this bodes well, as the x86 processors are doing a lot more work for only about 10x the load power draw. When idle, the CPUs draw far less.
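The idle-versus-load point can be made concrete with a simple duty-cycle model. The wattages and duty cycle below are illustrative assumptions, not Intel figures:

```python
def average_power(idle_w, load_w, duty_cycle):
    """Average draw for a CPU that is busy duty_cycle of the time
    and idle the rest of the time."""
    return idle_w * (1.0 - duty_cycle) + load_w * duty_cycle

# Illustrative: a chip idling at 0.5 W, bursting to 10 W, busy 5% of the time
print(average_power(0.5, 10.0, 0.05))  # 0.975 W average
```

This is why idle power dominates battery life for typical use: at a 5% duty cycle, the assumed average draw sits barely above idle, so a 20x idle improvement matters far more than the load number suggests.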

Re:Compared to ARM (0)

Anonymous Coward | about a year and a half ago | (#41308255)

That depends if you want systems that run everyday workloads or benchmarks. Intel builds processors that generate impressive computational benchmarks, ARM processors are aimed at power and build-cost benchmarks. Deal with it.

Re:Compared to ARM (0)

Anonymous Coward | about a year and a half ago | (#41310191)

I prefer to assume that 20x less means that its power draw is minus nineteen times that of the other architecture.

Mathematics is not a language for which slang and colloquialisms can become official, valid statements.

Re:Compared to ARM (0)

Anonymous Coward | about a year and a half ago | (#41314087)

As a mathematician, such a statement is perfectly clear and unambiguous to me. I think the problem might just be with you and a small number of other people.

Closing in on Atom (4, Interesting)

PhrostyMcByte (589271) | about a year and a half ago | (#41307005)

Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU

Re:Closing in on Atom (2)

All_One_Mind (945389) | about a year and a half ago | (#41307073)

Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU

At IDF, Intel also talked about upcoming 5W Atom chips that will be out at the same time as Haswell.

Re:Closing in on Atom (4, Funny)

Anonymous Coward | about a year and a half ago | (#41307107)

The part I was impressed with was how they did it...
"[...] which Intel has acheived by making use of a number of tactics."
+5 Informative!

Re:Closing in on Atom (5, Funny)

LordKronos (470910) | about a year and a half ago | (#41307825)

The part I was impressed with was how they did it...
"[...] which Intel has acheived by making use of a number of tactics."
+5 Informative!

Welcome to the internet. We have these things called hyperlinks. Anytime you see underlined text of a different color, you should consider clicking on it. If you had clicked on the phrase "number of tactics", you would have been taken to another article explaining many of those tactics.

Re:Closing in on Atom (4, Interesting)

Unknown Lamer (78415) | about a year and a half ago | (#41307147)

The best part is that, unlike Atom, these things are usably fast. I have a 2x1.3GHz Core 2 die shrink or something with a TDP of 12W (total system is about 26W ... under full load). I mostly live in Emacs, but I like a compositing window manager (wobbly windows are fun, alright) and GL screen hacks... the thing does great and can handle my regular abuse of PostgreSQL/SBCL/mlton/... all while getting about 8-9 hours of realistic use (ok, closer to 7 now that the battery is down to 72Wh from its 84Wh theoretical max when it was new), and all under 10W generally. Sign me up for something that uses about the same power and is just a bit slower than the current Ivy Bridge processors... (come on laptop, don't die until next summer).

And it all Just Works (tm) with Debian testing (it even worked with Squeeze, but GL was a bit slow since it predated the existence of the graphics hardware and all). Now, if only someone would make one of these low voltage things with a danged Pixel Qi display or whatever Qualcomm has so that I can use it outside... not having to worry about finding power every 2 to 3 hours is great, but if you're still stuck indoors it's not half as great as it could be.

The usual k1dd13 stupidity. (-1)

Anonymous Coward | about a year and a half ago | (#41309871)

The best part is that, unlike Atom , these things are usably fast.

(Emphasis mine.)

I use Atom processors for many general duty tasks, and they perform very well. What I don't generally try to make them do is truly heavy lifting, such as high def video encoding, or hard-core gaming. That's not what those chips are for.

The reason, in case you were wondering - in case you were doing the usual fuckheaded adolescent "I've got an i7!!!" dance - is because my entire site is off-grid. Every watt counts. Shit, every mW counts.

So I run low power consumption processors and systems for general computing tasks, and for custom stuff, such as remote weather stations, and that kind of thing. Intel's Atom processors excel in such situations (as do AMD's, it must be said), so reading the FUCKWIT DROOLINGS of an official Slashdot dickhead - a very aptly named one, at that - just makes me shake my head in despair at the knowledge of how draptastic this site's "admins" have become.

So, Unknown Lamer: go back to writing your gay porn Buffy fanfic, and stop bothering people who know about computers and stuff. Please.

Re:The usual k1dd13 stupidity. (0)

Anonymous Coward | about a year and a half ago | (#41311235)

yeah, you're so cool and not like a lamer at all with your swearing and insults.

Re:The usual k1dd13 stupidity. (0)

Anonymous Coward | about a year and a half ago | (#41315613)

When are people going to learn that you cannot criticize /. staff without them retaliating like this?

Doesn't anyone remember Michael Sims?

Re:Closing in on Atom (0)

Anonymous Coward | about a year and a half ago | (#41314925)

I have last year's AMD E-350: 2x1.6GHz, 80 Radeon cores, 18W TDP; if I turn off the wifi and the bluetooth and such, the system goes as low as 10W.

This system is good enough for Crysis and Mass Effect 3; Starcraft 2 needs low settings, though, due to lack of processing power.

I do a lot of math in Sage. I would like something a bit faster, so this Intel news sounds good.

Re:Closing in on Atom (3, Informative)

default luser (529332) | about a year and a half ago | (#41308129)

Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU

You're thinking of the wrong Atom CPU there. You want to compare Intel's lowest-power Core architecture to...their lowest-power Atom.

Intel has placed an Atom Z2460 on a smartphone platform, complete with 1.6 GHz core speed and sub-1W typical power consumption [anandtech.com], and they've done it on just the old 32nm process. The 10W parts you're thinking of are for desktops.

These 10W Haswell chips will also be the pick of the litter, but the power consumption will be nowhere near that of Atom (and neither will the price... expect to pay upwards of $300 for these exotic cores). The Lava Xolo X900 costs only $400 on the street [anandtech.com], so you can imagine Intel's charging around $25 for their chipset.

Graphic Capabilities (2)

Lord Lode (1290856) | about a year and a half ago | (#41307067)

So wait, is this only about the graphics part inside the CPU or what?

Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.

Re:Graphic Capabilities (3, Insightful)

silas_moeckel (234313) | about a year and a half ago | (#41307089)

Because most PCs sold use integrated graphics, which have traditionally been abysmal. In the last few years, seemingly pushed by AMD, vendors have been looking to correct that.

Re:Graphic Capabilities (-1)

geekoid (135745) | about a year and a half ago | (#41307139)

Pushed by Intel. AMD is following... still.

Re:Graphic Capabilities (1)

Anonymous Coward | about a year and a half ago | (#41307205)

Wrong, the only reason you have better integrated graphics from Intel is because AMD has been pushing performance since they bought ATI.

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41308729)

The only reason you have microprocessors of any kind is because Intel invented them.

Re:Graphic Capabilities (1)

Anne Thwacks (531696) | about a year and a half ago | (#41310109)

At the same time Intel was making one, pretty much every IT person on the planet was predicting their arrival. It was an idea whose time had come.

However, for those of us that live in the UK, there was no chance whatever of getting the funding to actually make one.

Re:Graphic Capabilities (2, Informative)

sexconker (1179573) | about a year and a half ago | (#41307207)

Pushed by Intel. AMD is following... still.

The GPU parts in AMD's "APUs" are miles beyond Intel's HD Graphics.

Re:Graphic Capabilities (1)

Bengie (1121981) | about a year and a half ago | (#41307493)

Last I checked, Intel's IGP got about 1/2 the FPS for 1/3 the TDP. I'm not sure about this Haswell version or what AMD will have to compete with Haswell.

Re:Graphic Capabilities (1)

TheRealMindChild (743925) | about a year and a half ago | (#41307521)

That 1/2 fps in a game is the difference between tolerable and unusable

Re:Graphic Capabilities (4, Insightful)

DeathFromSomewhere (940915) | about a year and a half ago | (#41307923)

1) Using integrated graphics for gaming if you are concerned about framerates is just dumb.
2) 1/3 the TDP is the difference between a battery with power and one without.

Re:Graphic Capabilities (1)

tepples (727027) | about a year and a half ago | (#41313753)

Using integrated graphics for gaming if you are concerned about framerates is just dumb.

So what 10" laptop has discrete graphics? Or should only turn-based games be played on a 10" laptop?

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41314979)

You should be aware of the capabilities of a device when you buy it. If the device doesn't meet your requirements, look for one that does. Don't expect integrated graphics to run Crysis 3.

--DeathFromSomewhere (don't feel like logging in).

Underwater on a device (1)

tepples (727027) | about a year and a half ago | (#41315879)

If the device doesn't meet your requirements, look for one that does.

Which is difficult if one is still making payments on the device that no longer meets one's expanded requirements, or if someone else controls the purse-strings for a household or business and fails to appreciate the expanded requirements. It's also difficult in a case where price, performance, and size are in a "pick two" relationship.

Re:Underwater on a device (1)

DeathFromSomewhere (940915) | about a year and a half ago | (#41319543)

Which is difficult if one is still making payments on the device that no longer meets one's expanded requirements, or if someone else controls the purse-strings for a household or business and fails to appreciate the expanded requirements.

That sounds more like a personal problem than a technical one. Can't help you there.

It's also difficult in a case where price, performance, and size are in a "pick two" relationship.

Pretty well what I'm saying, except mine are price, performance and power usage.

Re:Graphic Capabilities (2)

Bengie (1121981) | about a year and a half ago | (#41308253)

the difference between 40fps and 20fps at 1080p on medium quality in a modern game... on a netbook. Intel IGP is not meant for games, but it can run them at "meh" quality with really, really low power usage.

Re:Graphic Capabilities (1)

petermgreen (876956) | about a year and a half ago | (#41310779)

the difference between 40fps and 20fps

Is that average FPS or minimum FPS? For some reason benchmarkers tend to focus on the former, while the latter is what really matters. A game that plays at a solid 20fps would probably be tolerable (it's not much lower than movie framerates, after all); one that plays at 30fps most of the time but bogs down in intensive scenes would not.

Re:Graphic Capabilities (1)

Calos (2281322) | about a year and a half ago | (#41311027)

Good news! Haswell (the GT3 variants of it anyway) should approximately double Intel's IGP performance. For example: they demoed it playing Skyrim on High settings at 1920x1080.

Plus, some other good stuff [arstechnica.com] .

Re:Graphic Capabilities (3, Informative)

Anonymous Coward | about a year and a half ago | (#41307367)

The integrated graphics are still crap.

The thermal overhead added to the CPU die limits the amount of computational power they can realistically add. Not to mention that on enthusiast systems it creates needless waste heat that could be better spent on CPU cycles. (Supposedly we'll see some tightly integrated cpu+gpu systems with shared memory space and registers and whatnot.. But we're far away from that, as it presents a large departure from traditional PC architecture, let alone x86 arch. AMD is way ahead on this path though, and it may pay off for them in the future.)

Above aside, the real elephant in the room comes down to memory speed. GPUs need memory bandwidth. Lots of it. GPU speed scales with memory bandwidth to the point that it's pretty much the most significant metric separating price tiers in the traditional GPU card market. GPUs are supposed to have fast, high-clocked, tightly integrated memory subsystems using exotic high-speed memory types explicitly designed to be directly coupled to GPU chips, for their exclusive use (GDDR3, GDDR5, etc.).

These CPU-GPUs have to make do with the plain old low-bandwidth, narrow-bus main memory in your system. And they have to share that bandwidth with the rest of the system. GPUs are so memory-speed sensitive that you can see drastic differences in performance on the AMD chips simply by fitting faster main memory modules. Overclocking your memory yields even more improvement - but that's the thing: all of these solutions are budget oriented, so they'll be saddled with slow, cheap memory to begin with.

You know the Xbox? It's got shared memory. How do they get fast performance? ALL of the system's main memory hangs off the GPU. It's ALL GDDR3. They can do this sort of unified memory architecture because it's a special, custom-designed system.

Until the memory bandwidth issue is solved the integrated GPUs will continue to be crap.
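The bandwidth gap described above is easy to quantify with the standard theoretical-peak formula (transfer rate x bus width x channels); the module speeds below are illustrative examples, not figures from the article:

```python
def peak_bandwidth_gb_s(mega_transfers_per_s, bus_width_bits, channels=1):
    """Theoretical peak memory bandwidth in GB/s:
    transfer rate (MT/s) x bytes per transfer x channel count."""
    bytes_per_transfer = bus_width_bits / 8
    return mega_transfers_per_s * 1e6 * bytes_per_transfer * channels / 1e9

# Dual-channel DDR3-1600: 1600 MT/s on a 64-bit bus per channel
print(peak_bandwidth_gb_s(1600, 64, channels=2))  # 25.6 GB/s

# A hypothetical midrange GDDR5 card: 256-bit bus at 5000 MT/s effective
print(peak_bandwidth_gb_s(5000, 256))             # 160.0 GB/s
```

Even before the iGPU shares that 25.6 GB/s with the CPU cores, a discrete card with dedicated GDDR5 has several times the raw bandwidth to itself, which is the gap the parent is pointing at.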

Re:Graphic Capabilities (1)

Nimey (114278) | about a year and a half ago | (#41308197)

You could use integrated GPUs for vector/matrix math, something they're a lot better at than the x86 core, thus greatly increasing the efficiency of certain workloads.

You don't have to use them only for making pretty pictures.

Re:Graphic Capabilities (1)

Ostracus (1354233) | about a year and a half ago | (#41309065)

Correct, and GPU/CPU combinations also gain the next-fastest memory tier below the register, instead of going off-die as even discrete GPUs have to do for some workloads.

Re:Graphic Capabilities (3, Informative)

bemymonkey (1244086) | about a year and a half ago | (#41309661)

"The integrated graphics are still crap."

Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games or running applications that specifically require a fast GPU.

Even the HD3000 or HD4000 (Sandy and Ivy Bridge, respectively) graphics included with the last and current generations of Intel Core iX CPUs are overkill for most people - even a 4500MHD (Core 2 Duo 2nd generation) had perfect support for 1080p acceleration and ran Windows 7 at full tilt with all the bells and whistles, if you wanted those. What more do you want from integrated graphics?

The fact that I can even play Starcraft II on low at 1080p on a Core i3 with the integrated HD3000 at acceptable framerates is just icing on the cake...

Oh and have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU? THAT is what integrated graphics are for. If you're looking to do gaming or CAD or use the GPU for computationally intensive tasks, you're not in the target audience...

Re:Graphic Capabilities (4, Informative)

Solandri (704621) | about a year and a half ago | (#41310921)

My 2-year old laptop has an nVidia GT 330M [notebookcheck.net] . At the time it was a mid-range dedicated mobile 3D video card.

Ivy Bridge's HD4000 comes very close to matching its performance [notebookcheck.net] while burning a helluva lot less power. So the delta between mid-grade dedicated video and integrated video performance is down to a little over 2 years now. Intel claims Haswell's 3D video is twice as fast as HD4000. If true, that would put it near the performance of the GT 640M, and lower the delta to a bit over 1 year.

This is all the more impressive if you remember that integrated video is hobbled by having to mooch off of system memory. If there were some way to give the HD4000 dedicated VRAM, then you'd have a fairer apples to apples comparison of just how good the chipset's engineering and design are compared to the dedicated offerings of nVidia and AMD.

I used to be a hardcore gamer in my youth, but life and work have caught up and I only game casually now. If Haswell pans out, its integrated 3D should be plenty enough for my needs. It may be "crap" to the hardcore gamer, but they haven't figured out yet that in the grand scheme of things, being able to play video games with all the graphics on max is a pretty low priority.

Re:Graphic Capabilities (1)

drinkypoo (153816) | about a year and a half ago | (#41311043)

have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU?

Obviously at least an LED backlight, and probably turned way down. Or what, is it OLED?

Re:Graphic Capabilities (2)

bemymonkey (1244086) | about a year and a half ago | (#41311381)

Laptop displays have been LED backlit for years now - you can't buy a CCFL backlit display except maybe as a standalone monitor in the clearance aisle of your local big box electronics store...

As for AMOLED... that's useless as a laptop display, because it uses 2-5x as much power as a decently efficient LED backlit display when displaying mainly-white content (such as Slashdot or other websites) - not to mention the fact that AMOLED displays at this size (15.6" diagonal in this case, but consider this sentence to be true for roughly 10" diagonals and higher) cost thousands of $. I have had AMOLED displays on my last two smartphones, so I can definitely attest to the fact that the battery life sucks... on the other hand, the high contrast is pretty good, and you can conserve your battery life by using an operating system and apps with a lot of dark colors - hence why Android's UI has gotten darker and darker after their dev phones started being released with AMOLED screens ;)

So yeah, the laptop display is LED backlit. Just like every other laptop for the past 2 years or so :p

5W is with the display on roughly half or even 70% brightness, BTW - albeit using the low-res Chimei Innolux WXGA (1366x768) panel that came with the laptop... that thing was wicked efficient and very nicely readable in sunlight because of the huge pixels that made the panel almost transflective. I've since upgraded to a much less efficient panel (the FullHD 95% gamut panel used in the Thinkpad W520/530), and that still runs around 5.6-5.9W at half brightness :)

When your requirements grow (1)

tepples (727027) | about a year and a half ago | (#41314629)

Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games

So if someone buys a laptop for "Office, web and HD video" and later decides to try games, what should he do? Buy another computer? Whatever happened to buying something that will grow with your requirements?

Re:When your requirements grow (1)

bemymonkey (1244086) | about a year and a half ago | (#41315057)

The problem is that buying a laptop (or even desktop - although the problems are usually more pronounced on laptops) with a high-powered graphics card has very negative side-effects:

1. More heat - fan noise, uncomfortable heat during use, significant reduction in longevity (ever seen a non-plastic-POS Intel-based laptop without dedicated graphics overheat? I haven't...)
2. Higher power consumption - the most efficient laptops with dedicated non-switchable graphics draw upwards of 10W idle... many draw 15 or 20W. In comparison, a modern Intel Core i-X based laptop using onboard graphics will draw 4-7W. That's a 2x to 5x increase in battery life...
3. Fewer driver issues for the non-gaming folk. Intel's drivers suck for gaming, but for basic OS support they're great... Linux seems to support them all rather well, and even Pentium M generation graphics are still supported in Windows 7.

Laptops with switchable graphics are close to perfect, since they give you the best of both worlds, but if you use them with multiple monitors the dedicated graphics turn on automatically because the external ports are usually hardwired to the dedicated card... Machines with integrated-only stay cool even with two or three external displays attached.
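The battery-life claim in point 2 above is simple division; here's a sketch where the idle draws echo the comment and the 60 Wh battery capacity is an illustrative assumption:

```python
def battery_hours(capacity_wh, avg_draw_w):
    """Runtime estimate: battery capacity divided by average system draw."""
    return capacity_wh / avg_draw_w

# Illustrative 60 Wh pack, idle draws taken from the comment's ranges
print(battery_hours(60, 5))   # 12.0 h at ~5 W (integrated graphics)
print(battery_hours(60, 15))  # 4.0 h at 15 W (non-switchable dedicated GPU)
```

With these assumed numbers the integrated-graphics machine runs 3x longer, squarely inside the comment's claimed 2x-5x range.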

Re:Graphic Capabilities (1)

bWareiWare.co.uk (660144) | about a year and a half ago | (#41310721)

GDDR3 is an optimized type of DDR2 memory. The 700MHz GDDR3 in the Xbox 360 is less than half the speed of the DDR3-800 used by Atoms for the last couple of years - yet even if you fit them with 4 times the memory, they can't get close to the 360's graphics performance.
In the context of Haswell, you're talking about an entry level of dual-channel DDR3-1600, or around 25GB/s - beating the GDDR5 in bleeding-edge, top-of-the-line discrete cards from just 4 years ago.

Re:Graphic Capabilities (1)

edxwelch (600979) | about a year and a half ago | (#41312577)

Not true. Firstly, the memory channel to the iGPU is somewhat more sophisticated than just tacking onto the main memory bus. Secondly, the iGPU has much less processing power than a top-end dGPU, therefore it needs much less memory bandwidth; increasing bandwidth would be of no value.
iGPUs are mainly for budget laptops, where no dGPU is going to be installed. Ivy Bridge and Trinity iGPUs are powerful enough to run Crysis (on low settings). Something not to be scoffed at, considering low-budget buyers never had a chance to run games like this previously.

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41307497)

traditionally they have been abysmal

Performance has increased all around, so it seems like more. It's still abysmal though. The "nicest" thing about modern integrated graphics comes from advances in GPU architecture having plateaued. Integrated graphics don't lack whatever a game needs to run, like they used to. They just perform worse.

So, using integrated graphics today is possible. Traditionally, games would refuse to run at all.

Re:Graphic Capabilities (4, Insightful)

DigiShaman (671371) | about a year and a half ago | (#41308079)

It's official. Intel on-board video is all you'll ever need for home and general office use. I'm not talking about "getting by" performance. I'm talking about full 2D hyper-smooth animation, fading, and alpha blending. I've seen Ubuntu, Windows 7, and Windows 8 all running on-board. It doesn't bat an eyelash. No pausing or hiccups. Flash animation couldn't be smoother, either.

About the only reason you wouldn't use on-board video is if you must run an Adobe product that calls for GPU acceleration, or you're a gamer.

nVidia knows this too. As you can see, they've been focusing on advanced 3D gaming and supercomputing. It won't be long before nVidia turns into the next "SGI", where only the high end is their focus - to the point where they even start losing the consumer market and focus only on business-to-business solutions and other vertical market applications.

Re:Graphic Capabilities (1)

Narishma (822073) | about a year and a half ago | (#41308897)

nVidia knows this too. As you can see, they've been focusing in on advanced 3D gaming and super computing.

And mobile stuff (Tegra) where they have their own (licensed) CPU and GPU.

Re:Graphic Capabilities (1)

fa2k (881632) | about a year and a half ago | (#41310687)

It's official. Intel on-board video is all you'll ever need for home and general office use.

Agreed, but don't confuse this with "you should recommend integrated graphics to home users", though. Your examples are perfectly tuned for Intel graphics because many people have them. For a business, that's fine; you only need a fixed set of applications. For a home user, it's likely to be some Flash game, Google Earth, or other software that works much better with a dedicated graphics card. Good ones are quite cheap now, and if you're looking at an i5, spending some extra on a GPU gives more bang for the buck than getting a faster CPU. For laptops, it's up to the individual user whether they want battery life or performance.

Re:Graphic Capabilities (1)

Lennie (16154) | about a year and a half ago | (#41314873)

So what Intel has is "good enough" for 99.9% of the users, but AMD delivers the same thing but for less money ?

Re:Graphic Capabilities (1)

immaterial (1520413) | about a year and a half ago | (#41307255)

Useful for a laptop maybe

Hmm, I wonder where these ultra low-power chips are intended to go...

Re:Graphic Capabilities (1)

Hsien-Ko (1090623) | about a year and a half ago | (#41307393)

Maybe in something alluding from some sort of a VaporCrate some people want?

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41308543)

Surely you mean a SmokeBucket?

Re:Graphic Capabilities (2)

Hadlock (143607) | about a year and a half ago | (#41307363)

Laptops make up something like 50% of the consumer market, and integrated graphics are what go in most Dells for corporate users. An HD4000 has no problem pushing a dual or triple screen setup. The triple-head displays at my work choke on anything more than word processing - dragging a YouTube video across all three makes things very choppy. Also, the HD4000 is an actually usable chipset. It's nothing like the integrated graphics of old, like the GMA950, which couldn't even load TF2. The HD4000 will do TF2 at 250-300fps.

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41307543)

> HD4000 will do TF2 at 250-300fps.

Anything can do 300 fps on TF2. That's like benchmarking Minesweeper.

Re:Graphic Capabilities (1)

Anonymous Coward | about a year and a half ago | (#41308479)

Actually I benchmark most machines by seeing how fast the cards bounce after winning a game of Solitaire, you insensitive clod.

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41309913)

Wait, I thought TF2 stood for Team Fortress 2.

Anyway, my 2010-model (Core i5-460M) laptop can do Team Fortress 2 at 1366x768 at around 30 FPS. It's not silky smooth, but it is very playable. And this is relatively old hardware. I can believe you can do some gaming on the HD4000.

However, the also built in ati 5650 card just smokes it. Solid 60 with 2xAA.

Re:Graphic Capabilities (1)

Bengie (1121981) | about a year and a half ago | (#41311009)

The Core i5-460M is pre-Ivy Bridge; Ivy Bridge's IGP is about twice as fast as yours, and the Haswell version is faster than IB's IGP.

Re:Graphic Capabilities (1)

atlasdropperofworlds (888683) | about a year and a half ago | (#41307591)

Actually, I find the integrated GPU interesting - not for graphics, but for additional GPGPU power. Those things are fully OpenCL/DX11.1 compliant, so you can probably run some fluid simulation or n-body on them while at the same time doing some different crunching on the CPU, all being rendered extra pretty by a powerful discrete GPU.

Re:Graphic Capabilities (1)

LWATCDR (28044) | about a year and a half ago | (#41307757)

You do know that laptops outsell desktops. As for real work, boy, are you wrong: for anything outside of the sciences, CAD, CAM, and video and audio production, these will work very well. For all the home users who run Quicken and go on the web to use Facebook and such, this will do very well.
If these chips can get good enough performance on a 1080p monitor, they will be a giant boon for gaming. Most people use a single 1920x1080 monitor; if this allows a lot of games to look good enough, then more games will sell. Of course, just as today, you will have high-end video cards for enthusiasts and those who need more graphical power, but for many users this may be good enough.

Re:Graphic Capabilities (1)

viperidaenz (2515578) | about a year and a half ago | (#41307805)

Why do you need an actual graphics card to provide hardware acceleration to your OS windows and web browsing?
Why can't you have the integrated graphics render most things, and your games/cad software using a discrete card when they need it?

NVIDIA Optimus finally coming to Linux (1)

tepples (727027) | about a year and a half ago | (#41314707)

Why can't you have the integrated graphics render most things, and your games/cad software using a discrete card when they need it?

Because until a couple of weeks ago [phoronix.com], NVIDIA refused to make that technique (which it calls Optimus [wikipedia.org]) possible on a GNU/Linux operating system.

Re:Graphic Capabilities (0)

Anonymous Coward | about a year and a half ago | (#41308289)

Who cares about that graphics part inside the CPU.

Let's see, who might care. Companies who use or manufacture Laptops, Netbooks, Smartphones, Tablets, or even hand-held gaming systems. I'll admit this is a very small demographic consisting mainly of nerds but, then again, this is a nerd news site.

Also, there are many other reasons to find this interesting:
People who are generally interested in and follow the progression of microprocessor technology.
People who care about open source drivers to the point where they won't use ATI or NVidia on their personal machines (me).
People who use ATI or NVidia and like to play games as well as look for and comment on stories about Intel graphics (you).

Indeed, I'm starting to believe you would have a hard time finding someone who doesn't care.

Re:Graphic Capabilities (1)

aaron552 (1621603) | about a year and a half ago | (#41314449)

People who care about open source drivers to the point where they won't use ATI or NVidia on their personal machines (me).

Are you aware that there are open-source drivers (the ATi ones even have 3D) for both ATi and nVidia?

While I can understand avoiding nVidia if you don't want to install a proprietary graphics stack, why avoid ATi/AMD when there are serviceable open-source drivers for all but the latest cards?

Re:Graphic Capabilities (1)

drinkypoo (153816) | about a year and a half ago | (#41311035)

Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.

I have seen it asserted a couple of times now that the current Intel integrated graphics are acceptable for light gaming. If the new stuff is twice as powerful (I'm confused by the summary but don't care enough to RTFA, as I have no plans for new machines in that class any time soon) then it will be entirely useful for everyone but gamers who must play the latest and greatest at high resolution.

Hot Town! Summer in the City Just Got A Whole Lot (-1)

Anonymous Coward | about a year and a half ago | (#41307113)

Cooler! Still can feel the back of my neck all dirty and gritty!

Incorrect summary is incorrect (3, Informative)

Anonymous Coward | about a year and a half ago | (#41307123)

Intel's statement was that it could produce similar results to Ivy Bridge at half the power consumption, OR around twice the performance at the same power consumption as Ivy Bridge's built-in chip.

Which is still pretty good, all things considered.

Re:Incorrect summary is incorrect (1)

bemymonkey (1244086) | about a year and a half ago | (#41309693)

Sounds awesome to me... I'll take half the power consumption, thanks. I wonder if that goes for total idle power consumption... I'm already seeing less than 5W idle (depending on which LCD panel is installed - add a Watt for the enormously inefficient AUOptronics 1080p panel Lenovo uses) on my Sandy Bridge laptop (and that power consumption includes the SSD), so Haswell should hopefully be able to drop that to 3-4W... hopefully that'll also average out to ~2W less in actual use - meaning a 94Wh battery would net 16 hours instead of the ~12 I'm seeing now (I average about 7.5-8W in regular office/web usage with the 1080p panel)...

Now if only Lenovo would kill off the chiclet keyboards so I can move on from Sandy Bridge... otherwise I'll be stuck with this laptop for the next 10-20 years :(
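The parent's battery-life arithmetic holds up; a quick sketch (figures taken from the comment above, the function name is my own illustration):

```python
def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Idealized runtime: pack energy divided by average platform draw."""
    return battery_wh / avg_draw_w

# Figures from the comment: 94 Wh pack, ~7.75 W average in office/web use.
today = runtime_hours(94, 7.75)    # ~12.1 h, matching the ~12 h observed
haswell = runtime_hours(94, 5.75)  # ~16.3 h if Haswell shaves ~2 W off
```

This ignores battery aging and discharge inefficiency, so real runtimes land a bit lower, but the ~12 h to ~16 h jump from a 2 W saving is consistent.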

Re:Incorrect summary is incorrect (1)

mobby_6kl (668092) | about a year and a half ago | (#41316059)

Haha, what do you have, a T520? I was excited about the T530 until I saw the keyboard. It's not even the chiclet thing, it's the missing row and consequently fucked up layout!

Aaanyway, I've been getting extremely impressive battery life out of the Sandy based laptops, so the future looks bright :)

Re:Incorrect summary is incorrect (0)

Anonymous Coward | about a year and a half ago | (#41319339)

Sounds awesome to me... I'll take half the power consumption, thanks. I wonder if that goes for total idle power consumption...

That 2x performance at 1x power or 1x performance at 0.5x power is in reference to the Haswell integrated GPU in full active mode. When it comes to idle power, Intel is claiming a 20x (or better) improvement in Haswell.

Sandy / Ivy Bridge sat in S0 "active" state while idling. Although they could reduce power quite a bit in S0, it was still several watts. Haswell introduces a new S0ix "active idle" state which powers down most of the chip (power consumption in S0ix is just barely higher than true S3/S4 sleep states). The key advance is that (unlike S3/S4) transitions between S0 and S0ix are extremely fast, so much so that Intel just made them automatic and transparent to the OS, which thinks the CPU's awake all the time.

In other words, nearly-asleep is the new Haswell idle. If the transitions are truly that fast, for the sort of usage model you're talking about, your 94 Wh battery will mostly become a flashlight battery for your notebook's backlight.
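The win from fast S0/S0ix transitions can be modeled as a simple duty-cycle average. A sketch with illustrative numbers (only the ~20x idle-power ratio comes from the comment above; the wattages and duty cycle are assumptions):

```python
def avg_platform_power(active_w: float, idle_w: float, active_fraction: float,
                       transition_j: float = 0.0,
                       transitions_per_s: float = 0.0) -> float:
    """Average draw for a race-to-idle workload.

    active_w / idle_w: draw in the working and idle states;
    active_fraction: share of wall-clock time spent doing work;
    transition_j: energy cost of one idle-state round trip.
    """
    return (active_fraction * active_w
            + (1 - active_fraction) * idle_w
            + transition_j * transitions_per_s)

# Illustrative: 8 W busy, 2 W old-style S0 idle vs. 0.1 W S0ix (~20x),
# CPU busy 10% of the time.
old_style = avg_platform_power(8.0, 2.0, 0.10)  # 2.6 W average
s0ix = avg_platform_power(8.0, 0.1, 0.10)       # 0.89 W average
```

The model also shows why the transitions must be cheap: if each round trip cost significant energy, frequent wakeups would eat the idle savings.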

I love to fart. (-1, Offtopic)

Anonymous Coward | about a year and a half ago | (#41307145)

If I could no longer fart, my life would be empty and meaningless.

Oh, wait...

Selling function when all one can see is the form. (1)

Ostracus (1354233) | about a year and a half ago | (#41307203)

Welcome to the world of the supersmall. As real as software, and just as hard to impress when going, "see this".

Re:Selling function when all one can see is the fo (-1)

Anonymous Coward | about a year and a half ago | (#41307753)

"just as hard to impress when going, "see this"."

Like when you whip out that Vienna sausage you call a dick

Jews did WTC (-1)

Anonymous Coward | about a year and a half ago | (#41307275)

You know it's true. Intel is run by jews. Every intel chip you buy supports jewish world domination.

Good direction (1)

Anonymous Coward | about a year and a half ago | (#41307473)

Intel has laid its share of rotten eggs, but for the past few years they seem to "get it" relative to the technology market. Consumers want lower power consumption, small form factor, and hardware acceleration for mobile access to Internet services. Companies want higher core density per physical chip, lower power consumption, and virtualization to better deliver services to the Internet. If Intel delivers the best products for each segment of that ecosystem, they have a bright future ahead of them.

i7 powered coke machine (1)

Anonymous Coward | about a year and a half ago | (#41307645)

I accidentally went into the article and near the bottom they mention an i7 powered coke machine. Now that's bloat.

Re:i7 powered coke machine (0)

Anonymous Coward | about a year and a half ago | (#41311681)

It's not like you would buy a processor for a coke machine off newegg. If you go with Intel instead of some other embedded processor, and for many units, I'm sure they'd cut you a deal. Your programmers will be cheaper too, and software is the cost that keeps costing.

C++ on ARM vs. C++ on x86 (1)

tepples (727027) | about a year and a half ago | (#41314747)

If you go with Intel instead of some other embedded processor [for a vending machine], and for many units, I'm sure they'd cut you a deal. Your programmers will be cheaper too

How exactly are C++ programmers for ARM on something like a Raspberry Pi board cheaper than C++ programmers for x86 platforms?

Re:i7 powered coke machine (1)

GungaDan (195739) | about a year and a half ago | (#41311893)

That's just the code name for the Charlie Sheen bot they've got in skunkworks.

Upgrade path (1)

PrinceBrightstar (757413) | about a year and a half ago | (#41307917)

It looks like, based on what we're seeing from Intel's plan for Haswell, the upgrade path for those on Sandy Bridge-E is going to be Xeon going forward.

How many MIPS, how many MFLOPS? (0)

Anonymous Coward | about a year and a half ago | (#41308613)

So how many MIPS? How many MFLOPS can it do? IBM's $77,000 mainframe can do 29 MIPS. Their top of the line does something like 780 MIPS.

Can it do more than that? Go on, can it beat a computer costing millions of dollars that runs at 5 GHz?!
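MIPS is a poor cross-architecture yardstick (mainframe MIPS is a capacity rating, not an arithmetic benchmark), but MFLOPS is easy to estimate yourself. A rough Linpack-style sketch using NumPy (the function name and sizes are my own; it measures whatever machine runs it, not Haswell specifically):

```python
import time

import numpy as np

def estimate_gflops(n: int = 512, trials: int = 3) -> float:
    """Rough GFLOPS from a dense matmul, which costs ~2*n**3 float ops."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(trials):
        t0 = time.perf_counter()
        a @ b  # result discarded; we only want the timing
        best = min(best, time.perf_counter() - t0)
    return 2 * n ** 3 / best / 1e9
```

Even a modest laptop reports several GFLOPS this way, which is why raw MIPS comparisons against transaction-oriented mainframes are apples to oranges.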

Didn't showcase Itanic III? (1)

unixisc (2429386) | about a year and a half ago | (#41309331)

I'm surprised - reading TFA about the launch, they said nothing about the Itanium III? Have they given up on HP?

Still not good enough for me(just my opinion) (2)

csoh (45909) | about a year and a half ago | (#41309657)

What I want for my ultimate mobile computing device:

1. Small, lightweight, and has a physical keyboard
  I walk a lot, so I want a small device that fits comfortably in my backpack (so that's below 7'') and weighs less than 1.5 (preferably 1) pounds. I'm not an all-day mobile warrior, so I can live with a cramped keyboard, but after testing my wife's Galaxy S2 touch keyboard I decided I DO NEED a physical keyboard for typing documents/playing games (like NetHack and old DOSBox-compatible games).

2. MS application/IE compatibility
    I need to do business with MS Office documents and MS IE-only internet banking/payment processing. LibreOffice is not good enough if you have to edit/exchange MS Office documents with another business entity (and that stupid and powerful entity is stubborn enough that it wants genuine MS Office docs only and complains about the slightest incompatibility problems).

3. Very low power
At 10 W it will still need a fan or a huge heatsink. Moving parts/high power are not good for longevity/ruggedness, let alone battery life. My estimate is that you'll have to go below 2 W to achieve a compact & sleek design without a fan/huge heatsink - yes, the Atom Z5xx does that, and I have one now.

4. Usable graphics core without fsckups
I need a graphics core that supports Linux well and plays Angry Birds. The PVR core in Atom doesn't support either. Even their XP driver doesn't support basic OpenGL well enough.

5. Support for basic net tools/secure net connections I feel comfortable with
I want to redirect all normal net connections via VPN through my secure home base using OpenVPN when I connect to untrusted/public Wi-Fi. I believe that is reasonably achievable (without heavy source modification/manual recompiling) only with Linux/WinXP~7 for now. And I hate OSes that don't support basic net tools.

6. Trusted applications where I know what they are doing
I don't want applications that do unknown things behind my back (leaking private info for whatever reason or making net connections I don't want them to make). So I prefer well-known/open-source apps and have become skeptical of many Android/Google apps.

If you go the ultrabook route, you can achieve 2, 4, 5, and 6 for now.
If you go the Atom route, you can achieve 1, 2, 3, 5, and 6 for now. Currently I've settled for this.
If you go the ARM-based smartphone/pad route, you can achieve 1 (depends on the device), 3, 4, and maybe 5 (if you root your phone/pad) for now.
With a WinRT device, maybe you'll be able to achieve 1, 2, 3, and 4.

Of course things are changing, so somewhere in the future maybe you could do things with a platform that you couldn't do for now (compatibility/standards compliance gets better, Intel finally makes a 2 W non-Atom processor/drops the FSCKING PVR core from Atom, better performance to run emulation comfortably, corporations change their minds about privacy...). So I think it is a race between platforms to see which achieves the most within a reasonable time.
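Point 5, at least, is already straightforward with stock OpenVPN. A minimal client-side sketch (the hostname, port, and certificate file names are placeholders; the directives themselves, including redirect-gateway def1, are standard OpenVPN 2.x):

```
# Route all traffic through the home server while on untrusted Wi-Fi.
client
dev tun
proto udp
remote vpn.home.example 1194   # placeholder address and port
redirect-gateway def1          # send all IPv4 traffic via the tunnel
persist-key
persist-tun
ca ca.crt                      # placeholder certificate/key files
cert client.crt
key client.key
```

The redirect-gateway def1 form overrides the default route without deleting it, so the tunnel comes apart cleanly when the client disconnects.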

Wow - must remember bay trail (1)

csoh (45909) | about a year and a half ago | (#41309961)

Happy to know that the Bay Trail platform finally drops the PVR graphics core. Hope that some manufacturer produces the small form factor platform that I want in 2013.

GMA in Atom netbooks (1)

tepples (727027) | about a year and a half ago | (#41314805)

MS IE only internet banking

Other banks exist.

I need a graphics core that supports Linux well and plays Angry Birds. The PVR core in Atom doesn't support either.

Since when are PC makers still using GMA 500 (the PowerVR core) in new Atom netbooks? I thought they had all switched to four-digit GMAs, which have working drivers in Ubuntu.

Remember the GMA500 (2)

jbernardo (1014507) | about a year and a half ago | (#41310173)

Like any other owner of that orphaned Intel chipset, I'll never buy another Intel integrated video solution. Even if they manage to get their power consumption below competitive ARM SoCs, I will still not get that crap. The GMA500 disaster showed how much Intel cares for end users after selling them the hardware. So it is interesting that they managed to reduce power consumption so much, but my netbooks are still going to be AMD, and my tablets and phones ARM, possibly with NVidia's Tegra chipset. Intel will have to do a lot more to convince me to try their solutions again.

Re:Remember the GMA500 (2, Informative)

Anonymous Coward | about a year and a half ago | (#41310425)

The GMA500 was for embedded devices anyway, and not a real Intel chipset. Intel knows of the problem and is actively working on replacing those PowerVR chips with its own. ARM chips have the same or even worse problems than GMA500 chips: you don't have working drivers for those either, maybe some for Android, but not for Xorg.

Re:Remember the GMA500 (3, Informative)

Kjella (173770) | about a year and a half ago | (#41312739)

The GMA500 disaster showed how much Intel cares for end users after selling them the hardware.

GMA500 = rebranded PowerVR SGX 535. The graphics Intel develops itself isn't for serious gamers, but it has improved by leaps and bounds over the last couple of years. You're of course free to be unhappy about the Poulsbo, and with good reason, but most people with a recent Intel IGP are very happy, and sales of discrete cards only go one way: down.

Re:Remember the GMA500 (1)

Narishma (822073) | about a year and a half ago | (#41313015)

The GMA500 is not an Intel chipset. It's a rebranded PowerVR SGX something or other.
