
Intel Unveils 'Sandy Bridge' Architecture

Soulskill posted more than 3 years ago | from the better-faster-something-something dept.


njkobie writes "Intel has officially unveiled Sandy Bridge, its latest platform architecture, on the first day of IDF in San Francisco. The platform is the successor to the Nehalem/Westmere architecture and integrates graphics directly onto the CPU die. It also upgrades the Turbo Mode already seen in Core i5 and i7 processors to achieve even greater speed improvements. In Turbo Mode, Sandy Bridge processors can now draw more than the chip's nominal TDP when the system is cool enough to do so safely, enabling even greater boosts in core speeds than those seen in Westmere. No details of specific products have been made available, but Intel has confirmed that processors built on the new architecture will be referred to as 'second generation Core processors' and are expected to go on sale in early 2011. In 2012 the architecture is due to be shrunk to a 22nm process, under the name Ivy Bridge."


163 comments


I have first-ed this article... (0)

Anonymous Coward | more than 3 years ago | (#33569450)

pray I don't first anymore.

From the article:

"While the results were impressive, it was noticeable that as the workload on the graphics increased, the discrete processor pulled ahead in the framerates substantially. We’ll look forward to testing the new processors on our own benchmarks to see just how good the new integrated graphics really are."

So I guess discrete chips will still be around for a while...

I haven't been following AMD/ATI, what have they been up to in this area?

Re:I have first-ed this article... (4, Interesting)

DurendalMac (736637) | more than 3 years ago | (#33569656)

The graphics have already been benched. Anand had early samples and showed that the Intel integrated SB video was actually faster than a Radeon 5450 in most cases. Yeah, that's not great, but for integrated graphics that's pretty damned impressive.

Re:I have first-ed this article... (1)

tepples (727027) | more than 3 years ago | (#33569854)

Anand had early samples and showed that the Intel integrated SB video was actually faster than a Radeon 5450 in most cases.

To put this in context for someone switching from console to PC gaming, is this equivalent to a Wii's Hollywood GPU, equivalent to an Xbox 360's Xenos GPU, or somewhere in between?

Re:I have first-ed this article... (1)

petermgreen (876956) | more than 3 years ago | (#33570924)

My question would be: are those who want a laptop that is too small for a discrete GPU, and who care about graphics performance, going to be better off staying with a Core 2 Duo with an nvidia chipset, or will they be better off with Sandy Bridge?

Unfortunately anandtech ( http://www.anandtech.com/show/3871/the-sandy-bridge-preview-three-wins-in-a-row/7 [anandtech.com] ) didn't include nvidia integrated graphics in their comparison. Also, they were using a desktop chip, not a laptop chip, AFAICT (though, reading the comments, they don't seem to know for sure exactly what they had). Integrated graphics performance is far less relevant for desktops than for laptops since it's so easy to add dedicated graphics to a desktop.

Further, anandtech can't seem to be consistent in their benchmarking. I found an article where they benchmarked a 320M ( http://www.anandtech.com/show/3762/apples-13inch-macbook-pro-early-2010-reviewed-shaking-the-cpugpu-balance/2 [anandtech.com] ), but WoW was at lower settings and the other games didn't match up at all. It still told me that Sandy Bridge is probably better than the 320M, at least for WoW.

Re:I have first-ed this article... (5, Interesting)

hairyfeet (841228) | more than 3 years ago | (#33569678)

AMD has a new arch coming out which will go by Bulldozer [anandtech.com] for the mainstream and Bobcat for the low-end netbook market. It looks to be pretty bad ass, as their take on hyperthreading will have an actual integer unit per thread, with only the floating point unit being shared. Last I heard they were using the Radeon 5450 GPU for their integrated graphics, so it will definitely pump out the graphics. Next year should be an interesting time for us builders.

Sadly I'll probably sit this round out as my AMD quad is already faster than I am and at 8Gb of RAM I don't see myself needing anything else for quite some time, but Bulldozer based rigs should be great and affordable for my customers and if Bobcat rocks as well as their AMD Neo + Radeon discrete it'll make a kick ass multimedia netbook chip. According to those I've sold Neo dual based netbooks to they are getting around 5 hours on a charge and the graphics and video performance is awesome, and Bobcat is supposed to cut the power by anywhere from 40-60%.

So it looks like either way you go, next year is gonna be a nice time for new gear. Faster and better graphics, cool. BTW, does anyone know if the Intel chips will support some sort of hybrid SLI? The AMD setup allows you to pair a low-end discrete card with the onboard graphics and bring it up to midrange GPU performance. Of course, the way Intel has been trying to hamstring Nvidia lately, it wouldn't surprise me if they don't. You'd think Intel would just accept they suck at GPUs and buy Nvidia already. And if Nvidia doesn't end up buying Via so they can offer an "all in one" solution like Intel and AMD, I predict it's gonna be some bad times for them, with Intel trying to squeeze them out of their sandbox and AMD not needing them since buying ATI.

Re:I have first-ed this article... (1)

Keen Anthony (762006) | more than 3 years ago | (#33570066)

I've been out of the PC building rat race for several years now, and I'm diving back in. I don't know what AMD and ATI have to offer because Intel and NVIDIA are getting all the attention with technologies like Turbo Mode, SLI, and low heat dissipation in the i7. Everything I've been reading about the new GeForce GT and GTS has me very excited for all the graphics power I'll have, although the lack of support for StarCraft II on my 7600 GT based iMac has me pissed. Do you think AMD and ATI have something worthwhile for me to consider now, or do you think this next generation is enough that I should wait until 2011?

Re:I have first-ed this article... (1)

Manatra (948767) | more than 3 years ago | (#33570202)

I would suggest checking out ATI. For the last year and a half or so nVidia has been playing catch-up to ATI, especially when it comes to mainstream graphics cards. While Fermi is a decent video card, it's about the same as ATI's 5xxx series cards in terms of performance, only Fermi puts out more heat and draws more power. Right now the only card/price point that is considered solidly in nVidia's favour is the GTX 460. Coincidentally, the first video card from ATI's upcoming Southern Islands series (or is it Northern Islands? The names keep changing) will be targeted at the same price point as the GTX 460. Southern Islands should be releasing in the next month or two. I don't know what nVidia has in store really, but we'll see what happens. Both companies' road maps got upended when TSMC (their manufacturer) decided to suddenly dump the 32nm production process and go straight to 28nm. www.anandtech.com is a pretty good place to do your research, especially the video card forums; just try to avoid the flame wars in there :p

Re:I have first-ed this article... (1)

Keen Anthony (762006) | more than 3 years ago | (#33570270)

Thanks for the encouragement. So far, I'm leaning toward a laptop since I don't have space to house an iMac plus a PC monitor, and on the laptop end, my options for NVIDIA seem limited to the GeForce GT 330M. I'm sure I can get a lower-end GT 400 series NVIDIA in a laptop, but I'm guessing that would require I buy an Alienware or some other beastly looking laptop. I want sleek and Sony'ish like a proper laptop should be, so that's that. I haven't seen an ATI Radeon 5870 in a normal-looking laptop. Of course, I have been leaning heavily towards the i7 with its lower consumption and increased power. Being future-proofed for the next generation of Diablo and Starcraft games is important to me, as is running Sim-anything at full power :D

Re:I have first-ed this article... (1)

Manatra (948767) | more than 3 years ago | (#33570460)

You don't need to go Alienware. I am typing this on the Asus G73Jh, which I bought for $1500 + tax. Here is a review: http://www.anandtech.com/show/3662/asus-g73jha2-affordable-xlsized-gaming [anandtech.com] The laptop is a few months old so it's cheaper now; I've seen some places selling it for $1000.

That said, there's a version coming out soon with an nVidia graphics card. I would recommend getting one of those, since a large number of the ones with Radeon 5870s have grey-screen-of-death issues if you update the drivers. There are solutions, but they're a little weird. My solution to the grey screen of death was to install the 10.8 Catalyst drivers, overclock my video card by 5 MHz, and then download a still-in-beta hotfix from Microsoft that had something to do with the framebuffer causing freezes. Come to think of it, I don't think I even need the overclock anymore since downloading the update, but I digress.

The Asus G73Jh and its variants are some of the best bang-for-your-buck gaming laptops around. In short, the screen on the laptop is good and it gets great performance. The only downside is that some have the GSOD, but since the new laptops have nVidia cards that should be solved. The laptop is very large, so it's more of a portable desktop, but it will definitely fit into a smaller space than an iMac + PC monitor.

Re:I have first-ed this article... (1)

Keen Anthony (762006) | more than 3 years ago | (#33570514)

Awesome, thanks a lot! I'm surprised to learn that the GT 330M is still a DirectX 10 processor whereas the current ATI is DirectX 11.

Re:I have first-ed this article... (2, Interesting)

hairyfeet (841228) | more than 3 years ago | (#33570230)

Well, I would ask whether you are building this machine as an "ePeen" machine or not. By that I mean I have a couple of customers who spend frankly insane money just so they can brag that they get some huge number on benchmarks. Now if you aren't building for ePeen, I'd say go AMD, as the lower price will allow you to put nicer gear in the rig. For example my PC is an AMD Phenom II Quad 925, with 8Gb of DDR2 800, 2 500Gb HDDs, a nice case with a 500w PSU, and finally an HD 4650 (I'm not much of a gamer, so the 4650 is all I need, although it plays Wolfenstein and Bioshock II like a champ) and Windows 7 HP x64, all for $650 before MIR and around $570 after.

If you don't want to wait you can buy AMD now and, thanks to socket compatibility, drop in a bigger CPU later. That's what I did, buying a cheap dual core kit and upgrading to the quad with some of my Xmas bonus. You can see they have real cheap quad kits [tigerdirect.com] and you can even get a 6 core kit for under $600 [tigerdirect.com]. All you have to do is pick your favorite OS and whichever video card you like (I'm partial to ATI after the bumpgate fiasco; never had a bit of trouble from Gigabyte Radeon cards) and you are good to go. Most games now are just starting to use dual cores, so a quad will last you quite a while and a 6 core will be pretty future proof.

So if you are just getting back into the game, personally I'd go AMD. There is nothing wrong with the Intel chips, but when you figure in the higher prices plus them getting caught paying off OEMs... well, I believe in competition and a REAL free market. But of course your main reason will be performance, and my AMD quad purrs like a kitten, with an idle of less than 96F, and the hottest it ever got was 135F after hours of transcoding, and that is on a stock HSF. But I really torture my machines: audio and video recording and transcoding, audio and video editing, and my AMD takes everything I throw at it and then some. I can also tell you as a system builder I've not had a bit of trouble from any of my AMD builds, even those I have built for extreme conditions like construction trailers. They take a licking and keep on ticking. Let me know how it goes!

Oh, a word of advice...Once your build is done use Autopatcher [autopatcher.com] and Ninite [ninite.com] . Just use Autopatcher to have the updates for whichever Windows you choose already downloaded and ready to go, and then use Ninite to get all the basics like Firefox, K-Lite Codec Pack (great for hardware acceleration) and Open Office. Using those two together will save you several hours on a build. Enjoy!

Re:I have first-ed this article... (1)

Keen Anthony (762006) | more than 3 years ago | (#33570326)

Thanks. As I just told one slashdotter, I've been leaning toward a laptop since maintaining an iMac (which I use for development and work) on my desk will be difficult if I have to add a new PC monitor. :D

I've always loved AMDs, but even being out of the game, I've still heard major news coming out of Intel. AMD always seemed silent to me. Basically, my story is that I'd had enough of my Pentium IV PCs leaving my computer room frying in winter, so I became more conscious of issues like heat dissipation and noise. I decided to go the iMac plus game console route, which proved to be a money and energy saver for me. I ended up staying Mac-based through the Intel transition. Lessons learned: when you go Mac + game console, you save on chasing hardware upgrades, but you pay for it down the road by ending up years behind your fellow computer geeks on knowledge.

Thanks for those suggestions. Ninite will definitely save me time!

Re:I have first-ed this article... (1)

hairyfeet (841228) | more than 3 years ago | (#33570574)

Well why didn't you say so? Let the old Hairyfeet point you in the right direction. Now if you want a bad ass laptop I'd go with this one [yahoo.com] as I've had good luck with Acer and this one is top notch on CPU and GPU. Or if you are looking for something light and cheap I'd go with this one [newegg.com] which I've sold quite a few of and my customers love it. I liked it so much I got one for my dad, it has a great picture and the Radeon 3200 is great for video.

But whether you go desktop or laptop, the money you save by going AMD will allow you to have better hardware all around. That said, I've always been a desktop man at heart because I like being able to pop the hood and hot rod it. Have you thought about a KVM switch? I currently have 2 desktops running on a 4 port KVM that cost a whole $20, and when I had my laptop (gave it away to my oldest when he started college) I just plugged it in and used it with the desktop monitor. Having a nice big screen to work on just makes everything that much nicer IMHO. But either way you'll save serious cash over the i series which you can then use for nicer gear. Have fun!

Re:I have first-ed this article... (1)

Keen Anthony (762006) | more than 3 years ago | (#33570644)

Hehe. That Acer does look good. I like that the 5650 in it is DirectX 11. The MSI looks good too, but I believe the 3200 is only DirectX 10. Of course, nothing I'm interested in that's here or on its way soon is DirectX 11, so perhaps DirectX 11 support early on is a bit unnecessary.

KVM switch won't work with my setup. The iMac is an all-in-one from '06, so any desktop monitor would need to share desk space with my iMac which I use for work.

Forget turbo mode. It's nearly 2011; the innovations we need are on the supply-chain and assembly side. I want to be able to custom build an entire laptop, case included.

Re:I have first-ed this article... (0, Troll)

WilliamGeorge (816305) | more than 3 years ago | (#33570160)

I'm sure it is a typo, but if you really only have 8Gb (1GB) of RAM paired with a quad-core CPU then I feel bad for you :/

Re:I have first-ed this article... (0)

Anonymous Coward | more than 3 years ago | (#33570690)

As a gamer I agree - Intel sucks at meeting my needs. As far as making GPUs for the market, Intel dominates in volume, so clearly they don't 'suck' as far as the market is concerned (price).

Now when it comes to low-end gaming, it seems that Intel is actually catching up. If AMD is targeting the Radeon 5450, then they are losing ground. As Anand's latest data shows, the Sandy Bridge IGP is at about parity with the 5450.
http://www.anandtech.com/show/3871/the-sandy-bridge-preview-three-wins-in-a-row/7

That being said, I am hopeful that AMD's Fusion/Neo future (Zacate/Ontario) provides decent performance at a netbook price. Sandy Bridge won't be cheap.

Intel buy nVidia? Replace Intel CEO Otellini? (3, Interesting)

Futurepower(R) (558542) | more than 3 years ago | (#33570870)

"You'd think Intel would just accept they suck at GPUs and buy Nvidia already."

Should Intel buy nVidia? Jen-Hsun Huang [wikipedia.org] , who averages about $23.02 million per year [forbes.com] , is not the sort of person who would easily integrate into Intel, and he is important to the leadership of nVidia. Intel's CEO, Paul Otellini [wikipedia.org] , makes about $14 million. [computerworld.com]

Soon Intel's integrated graphics will have mid-range speed, leaving only the high range for nVidia. The high range of video adapters is mostly bought by teenagers who want to practice being violent with video games, instead of practicing being involved with other people. That means nVidia will be dependent on buyers who are being self-defeating; eventually there may be a backlash against that.

The high range of video performance will always be needed for architectural drawing and machine design, for example, but the total demand will drop, as the nVidia stock price [wikinvest.com] seems to indicate. So, maybe nVidia is not a good purchase for any company.

Should Intel CEO Paul Otellini be replaced? Another reason Intel should not buy nVidia is that Intel is generally a failure at anything besides making new CPUs and support chips. For the success of Intel and AMD in making CPUs, the world can be extremely thankful; that's enough success for any company.

But Intel in other areas seems amazingly badly managed. Intel marketing seems completely out of control. Is the product confusion [dailytech.com] at Intel a deliberate, sneaky way to sell slow processors to technically challenged customers, or just stupid?

Quote from the article linked just above: "Sandy Bridge PC processors will keep the CORE-i3, i5, and i7 designations and will be rebranded the "new CORE-i3..." That approach is likely to create confusion among customers about exactly what they're buying, given that the average user likely wouldn't be able to pick a Nehalem i7 from a Westmere i7 or Sandy Bridge i7."

Either Intel's purchase of the inferior security software maker McAfee for a "lofty 60% premium" [wsj.com] is a HUGE mistake, or the reasons why it is not a mistake should be explained by Intel marketing. No explanation was given, apparently. McAfee has a 21.9% market share selling software often pre-loaded on a computer to technically challenged buyers.

Quote from the article: " 'We believe security will be most effective when enabled in hardware,' Intel Chief Executive Paul Otellini said in a conference call." That seems a particularly wacky statement. "Security software" is needed only because, in my opinion, Microsoft deliberately allows its software to be insecure. Insecure software makes Microsoft more money because people with infected computers often buy another computer. For example, see the New York Times article, Corrupted PC's Find New Home in the Dumpster. [nytimes.com] The Apple Mac OS, Linux, and BSD operating systems do not require "security software" because they are made to be secure.

Intel CEO Otellini does not seem to have the social sophistication necessary to run a big company. When he made an announcement [youtube.com] in 2006 about the Intel Eduwise laptop, he seemed to be intending to have Intel compete with MIT professor Nicholas Negroponte's One Laptop Per Child (OLPC) charity program. However, Intel's intention seems to have been just to make a market for its Atom processor, which competes with the AMD Geode in the original OLPC computer. Otellini's entire involvement has been bad publicity for Intel. For example, see the January 9, 2008 BBC article Intel 'undermined' laptop project [bbc.co.uk] and the January 15, 2008 Inquirer article, Paul Otellini tells Intel staff about the OLPC affair. [theinquirer.net]

What do you think of Intel's failed Larrabee [anandtech.com] project? The idea was to make a multi-core CPU that could also be used as a GPU. But Intel was not able to deliver the software to make the Larrabee chips work, and abandoned the Larrabee project [anandtech.com], at least for now. Intel claiming it would compete with nVidia brought Intel a lot of bad publicity.

Why not just replace Mr. Otellini? Does he contribute something to Intel that isn't obvious?

Whatever you think of Mr. Otellini, in my opinion Intel certainly does not have the ability to integrate another company like nVidia.

Turbo Mode (4, Funny)

clinko (232501) | more than 3 years ago | (#33569452)

Old news. My 386 has turbo mode. Wake me when they add math coprocessors to this beast.

Re:Turbo Mode (1)

Bacon Bits (926911) | more than 3 years ago | (#33569544)

Don't we call those "graphics cards"?

Re:Turbo Mode (4, Insightful)

toastar (573882) | more than 3 years ago | (#33569580)

Don't we call those "graphics cards"?

Has Intel ever made a quality graphics coprocessor?

Re:Turbo Mode (0)

Anonymous Coward | more than 3 years ago | (#33569676)

yes but unfortunately they are on average 4 years too late so they look like ass

It looks like ass and still prints money (0)

tepples (727027) | more than 3 years ago | (#33569878)

yes but unfortunately they are on average 4 years too late so they look like ass

Looking like donkey compared to another product on the market never stopped Wii games from selling. Or what fundamental difference between the low-end PC gaming market and the low-end console gaming market am I missing?

Re:It looks like ass and still prints money (1)

petermgreen (876956) | more than 3 years ago | (#33571012)

With consoles, each console has a distinct pool of games. Most games are either released for a single console or for a group of consoles with similar capabilities. Sometimes there's a PC release as well, but the PC release is often either crap, late, or both.

If you want to play recent Mario games you have to buy a Wii (or maybe screw around with emulation on a PC, but most people won't bother to go to that much trouble). If you want to play GTA4 you have to buy a PS3, an Xbox 360, or a high-end gaming PC. If you want to play a recent Halo you have to buy an Xbox 360. If you want to play a recent Ratchet and Clank you have to buy a PS3.

OTOH PCs all draw from the same pool of games. With a low end PC some games are unplayable and others require you to turn down the settings and/or tolerate slowdowns. With a high end PC you should be able to run all games well (other than driver bugs which can hit both low and high end systems :( ).

Re:Turbo Mode (1)

aztektum (170569) | more than 3 years ago | (#33570052)

Define quality. If you mean stable and get the job done for most people, then I'd say yes. If you mean blazing speed and can run the latest games at a framerate that would melt your face, then no. ATi and nVidia have it covered and Intel is sticking with what they do best, general computer processors.

Also their Linux driver support is top-notch in this area.

Re:Turbo Mode (1)

toastar (573882) | more than 3 years ago | (#33570298)

Define quality. If you mean stable and get the job done for most people, then I'd say yes. If you mean blazing speed and can run the latest games at a framerate that would melt your face, then no. ATi and nVidia have it covered and Intel is sticking with what they do best, general computer processors.

Also their Linux driver support is top-notch in this area.

By quality I mean it's comparable to a current $200 video card.

I don't need to play Crysis at 60fps, but I sure would like to at least not have OpenGL 3.3 games be stereoscopic.

Re:Turbo Mode (0)

Anonymous Coward | more than 3 years ago | (#33569644)

To expand on this, graphics cards are really where it's at now. The PlayStation 3 has the Cell SPUs, many media players and cell phones have dedicated DSPs, and now graphics cards are becoming the equivalent mass number cruncher for PCs. While they certainly aren't suited for many tasks, they do have quite a few advantages over good old x86(_64):
  1. High-level language. OpenCL is the dominant interface for programming GPUs, and it is much higher level than other systems, especially the aging x86 instruction set. While this kills off hand optimization, it also increases portability, which leads to increased performance. Graphics card makers are free to implement any level of support into their graphics cards, and any architecture they see fit - OpenCL is surprisingly adaptable. It performs well on both nvidia's scalar architecture and ati's vector architecture, for example. It can even run decently on boring old x86 with SSE. The graphics card architecture can be changed drastically in the future without affecting backwards compatibility.
  2. Optimized for a particular job. This one isn't so much a "benefit" but it does help out for many applications. GPUs are generally optimized for maximum throughput - hence they are often deeply pipelined, vectorized, and outfitted with very fast (and large) floating point units. This can be a disadvantage, as branching performance usually suffers (and can really screw up a vectorized implementation).

Well, I guess that was only two items, but still - the emphasis is that for what most people want speed for, the CPU is not the place to get it. Video and image processing, game geometry, and 2d rendering really belong on a GPU-like architecture, not the CPU.
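
To make the OpenCL point concrete, here is a minimal sketch of what a kernel looks like (only the kernel is shown; the host-side boilerplate that compiles it and queues it on a device is omitted, and the kernel name and arguments are made up for illustration):

    /* OpenCL C: each work-item computes one element, so the same source can be
       compiled for nvidia's scalar cores, ati's vector units, or a CPU with SSE. */
    __kernel void saxpy(const float a,
                        __global const float *x,
                        __global const float *y,
                        __global float *out)
    {
        size_t i = get_global_id(0);   /* index of the element this work-item owns */
        out[i] = a * x[i] + y[i];      /* pure throughput work, no branching */
    }

The host decides how many work-items to launch (one per array element here), and the driver maps them onto whatever hardware is underneath.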

Re:Turbo Mode (0)

Anonymous Coward | more than 3 years ago | (#33569686)

portability does not lead to increased performance generally speaking.. it's usually the opposite.

Game geometry and branching (1)

tepples (727027) | more than 3 years ago | (#33569886)

branching performance usually suffers [...] Video and image processing, game geometry, and 2d rendering really belong on a GPU-like architecture, not the CPU.

I thought game geometry involved a lot of branching, especially in the cases of potentially visible set construction methods (e.g. portal casting or BSP), collision detection, and path finding. Or have these problems been solved?

Re:Turbo Mode (1)

Henriok (6762) | more than 3 years ago | (#33570952)

That would be the integrated x87 FPU that's been present since the 486, now with MMX, SSE, SSE2, SSE3, SSE4 and AVX.

second generation core? (1)

seeker_1us (1203072) | more than 3 years ago | (#33569462)

"Core 2" chips were out years ago.

Re:second generation core? (0)

Profane MuthaFucka (574406) | more than 3 years ago | (#33569632)

The words core and chips are there, but you made no attempt at an obvious turd joke.

Re:second generation core? (1, Funny)

Anonymous Coward | more than 3 years ago | (#33569722)

I think Intel wants to confuse the market even more with expressions like "Core 2 quad core 2nd gen".

Re:second generation core? (0)

Anonymous Coward | more than 3 years ago | (#33569910)

How about Core i8 Quad 793g+ Turbo? ( Core trademark, me )

Re:second generation core? (0, Troll)

IICV (652597) | more than 3 years ago | (#33569950)

Yeah, I'm getting a pretty strong sense of deja vu from this. Intel released their ill-considered 64-bit x86 extensions before AMD, but we all know what happened to the good ship Itanic.

Now AMD acquires ATI and starts making noises about releasing integrated CPU/GPUs, and what do we see? Intel releasing the same class of thing, in a package that runs hotter and draws more power, which is exactly the opposite of what you want in a mobile computer (which, I would imagine, is where you're most likely to see these chips being used).

Seriously, Intel, what are you guys doing?

Re:second generation core? (4, Informative)

Anonymous Coward | more than 3 years ago | (#33570336)

Wow. The nonsense... it hurts my brain.

First, IA64 is not a "64-bit x86 extension", it's a new ISA. AMD released x86_64 and Intel did very shortly after.

Second, Intel has had integrated CPU/GPUs out for a while. And you're crazy if you think Intel chips (now, not back in the bad old P4 days) draw more power and run hotter than AMD chips.

Basically everything you said is either wrong or backwards, and you confuse me because of this.

Re:second generation core? (0)

Anonymous Coward | more than 3 years ago | (#33570356)

Let's see the TDP figures for the AMD products you claim are superior then.

Re:second generation core? (1)

higuita (129722) | more than 3 years ago | (#33570882)

Don't forget that it isn't just the CPU that eats power and gets hot...

Intel's standalone CPU figures are nice, but add the required external chipset and those will look bad against the AMD CPU with its integrated northbridge.

It's just a matter of apples and oranges; you need to see things as they really are, not marketing figures.

Re:second generation core? (0)

Anonymous Coward | more than 3 years ago | (#33570964)

The recent Intel CPUs (i3, i5, i7) have the northbridge integrated, so it's apples and apples.

This is after their announcement that (2, Funny)

Centurix (249778) | more than 3 years ago | (#33569468)

They're opening a new factory in Madison County.

Intel needs to dump the DMI bus and go all QPI (1)

Joe The Dragon (967727) | more than 3 years ago | (#33569474)

Intel needs to dump the DMI bus and go all QPI; the last thing you want is Intel video lock-in and only 16 PCI-E lanes.

Re:Intel needs to dump the DMI bus and go all QPI (0)

Anonymous Coward | more than 3 years ago | (#33569600)

Why? For ultrathin laptops and small form factor desktops without room for a full x16 PCI-E slot, that doesn't make any sense.

3th party chip sets also apple. They can't stay co (1)

Joe The Dragon (967727) | more than 3 years ago | (#33569650)

3th party chip sets also apple. They can't stay core2 for ever on the mini / some of there laptops and intel video does not fit in there gpu api.
and they don't like to put full pci-e x16 video chips in there low end systems.

Re:3th party chip sets also apple. They can't stay (0)

Anonymous Coward | more than 3 years ago | (#33569960)

drunk monkey teenager translation:

Apple also has 3rd party chipsets. Monkey want bannanas. They can't continue using Core 2 forever on the mini, because their graphics librarys are already too much of a bottleneck. Monkey like kittens. Apple doesn't like to put full fledged PCI-e x16 interfaces in their low end systems, for some mysterious reason. Kittens, tasty.

Re:Intel needs to dump the DMI bus and go all QPI (1)

WarJolt (990309) | more than 3 years ago | (#33569604)

Affordable QPI on notebooks would be a hit.

Re:Intel needs to dump the DMI bus and go all QPI (1)

petermgreen (876956) | more than 3 years ago | (#33570258)

Ultimately for laptops and low end desktops moving all the high speed logic (graphics, CPU, memory controller) into one chip makes a lot of sense from both a cost and a power point of view.

Yes, it's annoying that this change has frozen out the option of an nvidia chipset with integrated graphics that were better than Intel's while being cheaper and lower power than a dedicated graphics chip with its own memory.

Yes, it's annoying that you can no longer use a low-end CPU with a high-end platform or vice versa like you could in the Core 2 days.

But while the former is annoying for laptop gamers and the latter is annoying for those whose CPU and IO requirements are mismatched, that does NOT mean that the decision to go for greater integration was the wrong one for Intel.

Time to buy all new chipsets! (1)

assemblerex (1275164) | more than 3 years ago | (#33569484)

Because this is the primary motivation, as no one is even coming close to maxing out an i7.

Re:Time to buy all new chipsets! (1)

0123456 (636235) | more than 3 years ago | (#33569670)

Because this is the primary motivation, as no one is even coming close to maxing out an i7.

Maxing it out at what? Maxing out an i7's CPU performance is trivial on a server that's doing CPU-intensive work.

Re:Time to buy all new chipsets! (5, Interesting)

symbolset (646467) | more than 3 years ago | (#33569772)

Yeah, but for virtualization workloads we're seeing that the processor really isn't being taxed at all. Basically the controlling factors are the amount of RAM and I/O latency. Speaking of which... Sandy Bridge has only two channels of RAM per socket instead of the current three.

Re:Time to buy all new chipsets! (2, Informative)

Cylix (55374) | more than 3 years ago | (#33569952)

I'm not sure about the desktop side, but on the server side it is certainly not two DIMMs.

Each bank is composed of three DIMMs and there are multiple channels per proc.

While I don't have the details on me, it's pretty easy to see that both camps have significantly increased their memory footprint, and it's quite easy to build a system with 256GB of RAM or greater.

In a few instances there are system types which do tax the proc far more than others. For these types of systems, and other instances where licensing per core is extremely costly, there is another type of processor which has a significantly higher clock frequency, but the trade-off is far fewer cores. (This is entirely a good thing when considering licensed applications.)

Re:Time to buy all new chipsets! (1)

petermgreen (876956) | more than 3 years ago | (#33570342)

Currently with Intel stuff (it's a while since I've looked at the AMD side) the laptop and low-end desktop platforms have 2 channels, and at least with the boards I've seen the max configuration supported is 4x4GB for a total of 16GB.

The current Intel high-end desktop platform has 3 channels, and at least with the boards I've seen the max configuration supported is 4x4GB for a total of 16GB.

Workstation/server platforms go much higher: with the right board and a big enough budget you can get 18x8GB (maybe more now if larger modules have appeared) on a dual-socket platform, and more on the 4/8-socket platform (though all the boards I've seen for that platform have not come even close to maxing out the chipset's memory capability).

All the details I've seen about Sandy Bridge seem to refer to a laptop/low-end desktop platform. I've seen a mention of a high-end variant but no real details of it.

Re:Time to buy all new chipsets! (0)

Anonymous Coward | more than 3 years ago | (#33570238)

Because this is the primary motivation, as no one is even coming close to maxing out an i7.

That's bullshit. Just launch make -j $number_of_cores on a mid-size project and you're maxed out. And it's still dog slow.

Hehehe (4, Funny)

squiggly12 (1298191) | more than 3 years ago | (#33569520)

Please let me push a button on the case to enable "turbo" mode.

Re:Hehehe (5, Funny)

vertinox (846076) | more than 3 years ago | (#33569608)

Please let me push a button on the case to enable "turbo" mode.

Lol. Those were the days. I once worked in a computer shop in the mid '90s where we upgraded some guy's 386 to one of the new 486s (DX, I think) by swapping out the entire board, but we kept the case to save him some money.

He comes back into the shop and complains that the turbo mode doesn't work anymore. We tried to explain that the new model was way faster than the 386 even in turbo mode, but he didn't seem to understand.

So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.

Re:Hehehe (1)

syousef (465911) | more than 3 years ago | (#33570010)

He comes back into the shop and complains that the turbo mode doesn't work anymore. We tried to explain that the new model was way faster than the 386 even in turbo mode, but he didn't seem to understand.

So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.

You should have sold it as a 486 Special Edition. Would have been cool if you could have rigged it up with a speaker to make an extra-loud fan noise too ;-)

Re:Hehehe (1)

Keen Anthony (762006) | more than 3 years ago | (#33570078)

I loved that little light. I won't buy a PC that doesn't give me that little light and a little button to activate it with. Add the ability for me to set the color of the light programmatically, and I'll be brand loyal.

Re:Hehehe (4, Interesting)

DigiShaman (671371) | more than 3 years ago | (#33570400)

The ironic thing is that the "Turbo" speed was actually the native speed of the CPU. When you disabled turbo, you were actually underclocking it so that applications (games, really) would run slower.

Basically, the parent wants to use the turbo button to overclock the CPU. This is the opposite of how it was used in the past.

My turbo button really worked! :) (2, Interesting)

higuita (129722) | more than 3 years ago | (#33570978)

I had an AMD 486 DX5 at 133MHz in a 386 case, after some upgrades...

I connected the turbo button to the bus speed jumpers, so when I pressed it, the bus jumped from 33MHz to 40MHz, overclocking the CPU to 160MHz... I ran at "full" speed when I was at home and put it back to normal speed when I left it idle.

To my surprise, it worked really well: the PCI bus accepted that speed, and the network and SCSI cards never gave any error until I disconnected the computer about 6 years ago.

I also tried to up the bus to 50MHz, and the CPU, RAM, the VESA local bus (for the graphics card) and the ISA bus (sound card) worked fine, but it was too much for the PCI bus, and the network and SCSI cards didn't work, so I gave up on having a 200MHz 486 CPU and fell back to the already "good" 160MHz... The relative power of the setup was about a Pentium 90-100MHz; running at the normal 133MHz, the performance was a little lower than a Pentium 75MHz.
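
Those numbers line up if the chip was running the usual 4x clock multiplier of that era (my assumption; the post doesn't say):

    33 MHz bus x 4 = ~133 MHz (stock)
    40 MHz bus x 4 =  160 MHz (turbo-button overclock)
    50 MHz bus x 4 =  200 MHz (too much for the PCI cards)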

Re:Hehehe (1)

Zoxed (676559) | more than 3 years ago | (#33570428)

> So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.

I bet his amp goes up to 11 as well !

Re:Hehehe (2, Funny)

ricky-road-flats (770129) | more than 3 years ago | (#33570840)

Reminds me of a story too...

My Dad had a new-ish 386 PC which he loved; he especially loved how fast it was. One weekend I played some games on it, one of which (maybe Level 42?) needed the turbo off, as it was way too fast to play at the full nosebleed speed of 33 MHz. I then went away for the week.

When I came back that Saturday lunchtime, he was literally waiting on the driveway for me, purple with fury. He'd been struggling for the whole week with an unusably slow PC, and he'd tried rebooting, and he'd tried lots of things, and it had ruined his week... basically he was ready to murder me, and woe betide me if I didn't fix it pronto.

I was in a panic - what the hell had broken to make it so slow? Was it something I'd got from a BBS with a virus? Was it some TSR causing an issue?

The panic ended when I walked into his study, and from across the room saw the Turbo light off. I walked over to it, pressed Turbo, and let him try again. Problem solved. It was years before he could laugh about it...

That reminds me, I should dig out Level 42 and try it on my 3.4 GHz machine... maybe running it in DOS in Bochs would slow it down enough?

Re:Hehehe (1)

Megahard (1053072) | more than 3 years ago | (#33569664)

Those old chips may have had a Turbo mode, but the new ones will go to 11.

Re:Hehehe (1)

sokoban (142301) | more than 3 years ago | (#33570114)

And in 2 years they'll go to 22.

Laptops still have a turbo mode (1)

tepples (727027) | more than 3 years ago | (#33569900)

Please let me push a button on the case to enable "turbo" mode.

It's not a button on the case, but several laptops give the user a taskbar control to change the power-management strategy. So you have a "turbo" setting and a "battery life" setting.

Yay for heating my house! (0, Troll)

TD-Linux (1295697) | more than 3 years ago | (#33569548)

For serious? Their leading feature is the ability to run even less efficiently than before? Extra speed at the expense of power is great for desktops - too bad people hardly buy those anymore. Some of us actually want to run these things ON A BATTERY - can you imagine? Then again, being in Minnesota, I welcome the added heat.

Re:Yay for heating my house! (2, Insightful)

mirix (1649853) | more than 3 years ago | (#33569578)

Yeah. No one ever buys a desktop, and they certainly don't ever want it to be faster.

Re:Yay for heating my house! (2, Interesting)

IICV (652597) | more than 3 years ago | (#33569976)

If I want to make my desktop faster, I can replace the graphics card or CPU independently - it's big enough that an integrated CPU/GPU solution doesn't really make that much sense yet.

Mobile devices, on the other hand, make a lot more sense; if you can integrate the CPU and GPU on one chip with a reasonable max TDP, that's significantly less complexity in the design with more computing power. You should see the heatsink arrangement in my HP laptop with a discrete CPU and GPU - it's insane, heat pipes and fans everywhere.

Re:Yay for heating my house! (1)

keatonguy (1001680) | more than 3 years ago | (#33569584)

I'm sorry, what? People hardly buy desktops anymore? Are you posting from 2025 or something?

Re:Yay for heating my house! (1)

TD-Linux (1295697) | more than 3 years ago | (#33569708)

Yeah, that might have been a bit on the extreme side - however, desktops are currently 32% of sales and falling.
Meh, maybe I'm just an embedded person who treasures ARM above all else and thinks that 640k ought to be enough for anyone.

GBA (0, Offtopic)

tepples (727027) | more than 3 years ago | (#33569914)

Meh, maybe I'm just an embedded person who treasures ARM above all else and thinks that 640k ought to be enough for anyone.

Then go ahead and stick to your Game Boy Advance with its ARM7TDMI CPU and 384 KiB of RAM. If it's good enough for TOD [pineight.com] , it might be good enough for you.

Re:Yay for heating my house! (2, Insightful)

antifoidulus (807088) | more than 3 years ago | (#33569768)

You do realize that a) Intel makes mobile chips as well that take power saving into consideration and b) TFA doesn't say it, but this feature will almost certainly be configurable by the bios and/or OS.

Re:Yay for heating my house! (1)

0123456 (636235) | more than 3 years ago | (#33569924)

You do realize that a) Intel makes mobile chips as well that take power saving into consideration and b) TFA doesn't say it, but this feature will almost certainly be configurable by the bios and/or OS.

Indeed: in normal use while web-browsing and the like -- at least according to the Linux battery monitor -- my i5 laptop takes only slightly more power than my Atom netbook. But if I plug it into the wall I can play any modern game decently (with its Nvidia GPU, not whatever's integrated with the CPU).

Re:Yay for heating my house! (0)

Anonymous Coward | more than 3 years ago | (#33570014)

i agree, i put together a 980x/rIIIe/gtx480 for myself before the summer.
this fookin biznitch ate more juice than my A/C.
sent my elec bill thru the roof.
plus it heated up the house so much, that my fridge and a/c had to work twice as hard lol
again, elec bill was sky high

-HasHie, nyc

Re:Yay for heating my house! (1)

Vegemeister (1259976) | more than 3 years ago | (#33570834)

English, motherfucker, do you speak it?

Re:Yay for heating my house! (1)

Vegemeister (1259976) | more than 3 years ago | (#33570802)

The idea is to make the power consumption scale over a wider range in proportion to the load. If you have a CPU that uses 13% of its TDP at 10% load, you can use a much beefier chip in the same application than if, say, it consumed 80% of its TDP at 10% load. You can keep the power consumption the same but greatly increase the perceived responsiveness.

I, for one, welcome the day when the battery life of my laptop is dominated by the load average instead of how long it's been running. My phone can play 720p video, and the battery is good for barely more than a single movie. However, it'll last for days on standby in my pocket. Imagine that dynamic range with your computer.
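
A rough back-of-the-envelope illustration of that 13% vs 80% comparison (the 10 W light-load budget is a made-up number, picked just to make the arithmetic easy): if light load is only allowed to draw 10 W, then

    chip A: 0.13 x TDP = 10 W  =>  TDP can be as high as ~77 W
    chip B: 0.80 x TDP = 10 W  =>  TDP is capped at ~12.5 W

So within the same light-load power budget, chip A can be roughly six times beefier and simply turbo up when there is real work to do.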

Re:Yay for heating my house! (1)

HasHie (1900662) | more than 3 years ago | (#33571008)

i agree, i put together a 980x/rIIIe/gtx480 for myself before the summer.
this fookin biznitch ate more juice than my A/C.
sent my elec bill thru the roof.
plus it heated up the house so much, that my fridge and a/c had to work twice as hard lol
again, elec bill was sky high

-HasHie, nyc

P.S just made a /. acct

Wow... (3, Funny)

bennomatic (691188) | more than 3 years ago | (#33569570)

...I just drove over the Sandy Bridge this evening. Coincidence? I don't think so!

Re:Wow... (0)

Anonymous Coward | more than 3 years ago | (#33569996)

And I just met a sandy vagina there. Double coincidence? You decide.

Re:Wow... (0)

Anonymous Coward | more than 3 years ago | (#33570604)

paging the 'good news bible' asshat here.

Any news for Apple in this (1)

AHuxley (892839) | more than 3 years ago | (#33569596)

A better/different selection of real GPUs? Or is this all just about a slow, all-in-one Intel on-chip GPU option?

Re:Any news for Apple in this (1)

Wesley Felter (138342) | more than 3 years ago | (#33569642)

The Sandy Bridge GPU is still weak by Apple's standards, but they can't keep using Core 2 forever. As long as OS X can compile OpenCL into AVX code I think Sandy Bridge will work OK in future MacBooks.

Re:Any news for Apple in this (1, Flamebait)

AHuxley (892839) | more than 3 years ago | (#33569712)

weak by Apple's standards ... it must really suck.

Re:Any news for Apple in this (0)

Anonymous Coward | more than 3 years ago | (#33569958)

Why do you need a screaming GPU for a Mac anyway? It's not like you're going to play games on it.

Re:Any news for Apple in this (1)

Yvan256 (722131) | more than 3 years ago | (#33569728)

How does that new Intel integrated GPU compare to the nVidia 320M currently used by Apple?

The important question is... (1)

keatonguy (1001680) | more than 3 years ago | (#33569598)

...will it use a new socket? I just shelled out for a fresh build because my mobo's processor socket was deprecated; I really hope they don't turn around and change it again so soon.

Re:The important question is... (2, Insightful)

mirix (1649853) | more than 3 years ago | (#33569624)

I'm assuming that even if there are dead pins on the current socket that can be used for the video portion, no existing boards will have this capability... so it wouldn't matter anyway, right?

thinkin' new socket.

Re:The important question is... (0)

Anonymous Coward | more than 3 years ago | (#33569694)

The new sockets are called LGA 1155, LGA 1356 and LGA 2011, from low-end to high-end markets. Yay! All new sockets!

Re:The important question is... (3, Informative)

sdot1103 (939642) | more than 3 years ago | (#33569724)

Yup, if you shelled out for Socket 1366 (high-end i7), you're going to be sticking with Nehalem until Socket R comes out down the line.
If you went with 1156, which I did (P55 Classified + i7 860 @ 4.0 GHz), then you're screwed, just earlier, since it's now Socket 1155, which isn't compatible even though it's just a one-pin difference.
I wasn't very happy with Intel when I found this out, since they've recently switched sockets after holding on to 775 for so long. From my understanding AMD has also done something with the AM2/AM3 sockets where some motherboards are backward/forward compatible, but others aren't. I think there is a derivative socket, AM2+/AM3+, that is backward compatible, but the standard AM2/AM3 version isn't forward compatible. Don't take my word on it though; my builds have been Intel since the Q6600 came out. AMD has done a better job of backward compatibility, but the sweet spot for price/performance + overclocking has been Intel chips whenever I've done my last few builds, and I only do builds every few years, usually after new architectures are released, so my motherboards are usually replaced as well.
Anandtech covered upcoming socket changes in more detail in their writeup [anandtech.com]

Re:The important question is... (0)

Anonymous Coward | more than 3 years ago | (#33570130)

AM2 procs won't work on AM3 mobos due to the lack of a DDR3 controller.
The reverse DOES work, however, assuming proper voltage support (AM2+ is needed for the split-plane power supply, although I'm not sure if that's REQUIRED, or just for stability).

Honestly, if you look around there are *Socket 939* boards using the 785G chipset, and 690V boards using AM3, so AMD's current chipsets are pretty backward/forward compatible.

Mind you, my current reason for siding with AMD this generation is the 890FX: the cheapest mobo available for IOMMU support as well as 2+ x16 PCIe slots.

Re:The important question is... (1)

Osgeld (1900440) | more than 3 years ago | (#33569726)

Of course it does, Intel farted, which automatically requires a new socket and 4 more power supply pins

Re:The important question is... (1)

cheater512 (783349) | more than 3 years ago | (#33569760)

And that is why I go for AMD. Identical socket with extremely gradual incompatibilities.

Not a single word on Intel killing overclocking? (3, Interesting)

Kartu (1490911) | more than 3 years ago | (#33569920)

Not a single word on Intel killing overclocking, eh? According to Anand's article, the majority of new CPUs won't allow ANY kind of overclocking.

Re:Not a single word on Intel killing overclocking (4, Funny)

H0p313ss (811249) | more than 3 years ago | (#33570172)

Not a single word on Intel killing overclocking, eh? According to Anand's article, the majority of new CPUs won't allow ANY kind of overclocking.

And 128 nerds cried themselves to sleep... :)

dont expect a 980x killer just yet (0)

Anonymous Coward | more than 3 years ago | (#33569926)

I doubt Intel is ready to cannibalize its existing 'i7 Extreme' lineup just yet.
Especially since no apps that the average douche uses come even close to fully utilizing a 980x.
Expect a power-friendly, i5-type chip at first.

What I wonder is if they'll finally use a decent chipset (X58 blows) with more lanes for PCI Express.
I would like to see motherboards capable of using 4 or more graphics cards in full x16 setups (not fractional crap like 16x/16x/8x or 8x/8x/8x/8x).
Quad channel memory configs, anyone?
Also, I would like the integrated memory controller to be able to handle a higher voltage load, allowing higher RAM overclocks.

BTW, executing graphics APIs using the vector instruction circuits on the CPU won't be remotely as fast as a discrete chip made just for that (a graphics card).
They won't have enough transistors set aside for vector circuits to entirely handle a modern commercial 3D game's requirements.
Also, the instructions they've chosen for the AVX set don't help the execution of 3D APIs at all.

-HasHie
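
For reference, this is roughly what AVX gives you on the CPU side: 256-bit registers holding eight single-precision floats processed by one instruction. A minimal, self-contained sketch (the array values are arbitrary):

    #include <immintrin.h>   /* AVX intrinsics; compile with -mavx (GCC/Clang) */
    #include <stdio.h>

    int main(void)
    {
        float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float y[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float out[8];

        __m256 vx = _mm256_loadu_ps(x);                        /* load 8 floats at once */
        __m256 vy = _mm256_loadu_ps(y);
        __m256 vr = _mm256_add_ps(_mm256_mul_ps(vx, vy), vx);  /* x*y + x on all 8 lanes */
        _mm256_storeu_ps(out, vr);

        for (int i = 0; i < 8; i++)
            printf("%.1f ", out[i]);
        printf("\n");
        return 0;
    }

Eight multiplies and adds per instruction is nice, but as the comment above argues, it is still a long way from the hundreds of shader ALUs a discrete GPU throws at the same problem.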

Really need to rationalise naming (3, Insightful)

Arimus (198136) | more than 3 years ago | (#33570080)

Thought the '2' in Core2 referred to the second generation already...

With the Core i3/5/7 being the third these are more like the fourth generation.

Might be time for people who make C(G)PUs to have a rethink on naming schemes... maybe even take a leaf out of the software industry, e.g

Core i .

Re:Really need to rationalise naming (1)

Arimus (198136) | more than 3 years ago | (#33570100)

Hm. SNAFU something grabbed rest of my post. :-(

Anyway

Core i[number of cores] [major design].[revision] would ease confusion... then just have a list of major-design-to-code-name mappings (or even append the name when listing CPUs on your product pages if you are a vendor) and the job is done; at least in terms of stopping the naming confusion.

Re:Really need to rationalise naming (1)

dakameleon (1126377) | more than 3 years ago | (#33570294)

Maybe it'll be Core ii X ? The i is roman numeral lower case?

Core ii9 here we come!

What is TDP? (1)

clone53421 (1310749) | more than 3 years ago | (#33570128)

From TFA:

Of course, we are left wondering what TDP means now, if exceeding it is standard.

Ironically, I was already wondering that. The article never says what TDP is, and a Google define: search wasn't terribly enlightening.

Re:What is TDP? (2, Informative)

totally bogus dude (1040246) | more than 3 years ago | (#33570260)

Thermal Design Power. Basically a measure of the amount of cooling required to prevent the chip frying.
