
Intel Sandy Bridge Desktop and Mobile CPUs

CmdrTaco posted more than 3 years ago | from the touch-the-future dept.


Vigile writes "The new Intel Sandy Bridge architecture is being launched at CES this week but the reviews and benchmarks are out today. PC Perspective took a look at both the desktop and mobile variants, the former of which turns out to be quite an impressive processor for both highly threaded and single threaded applications. With some tweaks to the execution unit, a new Turbo Boost mode that increases clock speeds dynamically and a vastly improved integrated graphics implementation, the Core i7-2600K improves in every aspect. Also interestingly, the most expensive desktop part will start at $317, putting the screws to AMD yet again. On the mobile side of things, PC Perspective tested the quad-core Core i7-2820QM and the benchmark results are equally impressive; especially when looking at the gaming performance using integrated graphics. Sandy Bridge will no doubt put quite a dent in the discrete notebook graphics market for NVIDIA and AMD."

116 comments

I have to say, (-1)

Anonymous Coward | more than 3 years ago | (#34742990)

I just woke up and the gases hanging out under the covers are disturbing.

Impressive graphics ? (1, Insightful)

RedK (112790) | more than 3 years ago | (#34743040)

What benchmarks is the poster reading, exactly? On the Mac side, the SB IGP barely beats out the current nVidia 320M in shipping MacBooks at low settings (a CPU-bound task), and couldn't match its performance at medium settings, meaning the SB IGP is slower than nVidia's offering from 2009!

There's nothing impressive, this is standard Intel IGP fare.

Is this the Tock? (2, Informative)

cyclocommuter (762131) | more than 3 years ago | (#34743142)

The Sandy Bridge architecture, aside from the die shrink and the increase in clock rate that it entails, is in my opinion not that much of an improvement over the previous i7 Lynnfield architecture (i7 860, 870, 875k, 880). Here is an article that benchmarks [inpai.com.cn] a Sandy Bridge CPU vs an i7-875K with the frequency of both processors set to 3.4 GHz... not that big of an improvement.

The funny thing is that many of the articles today are praising the chip as a big improvement over Lynnfield without making it clear that this is most likely due to the clock rate increase.

Re:Is this the Tock? (1)

gabebear (251933) | more than 3 years ago | (#34743522)

The i875 is a 95W TDP CPU!!! The 2820QM has a TDP of 45W.

Re:Is this the Tock? (1)

gabebear (251933) | more than 3 years ago | (#34743532)

That's 95W before you overclock it from 3.0 GHz to 3.4 GHz...

Re:Is this the Tock? (2, Informative)

Anonymous Coward | more than 3 years ago | (#34743608)

Also, if they don't like you they can disable the processor from afar. All that at no extra cost! That will be a boon for stopping computers from spreading to countries they don't like.

http://www.techspot.com/news/41643-intels-sandy-bridge-processors-have-a-remote-kill-switch.html [techspot.com]

Re:Is this the Tock? (0)

Anonymous Coward | more than 3 years ago | (#34743846)

Looks like it will be tin foil hats for our computers now.

Re:Is this the Tock? (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34745026)

Oooh, yeah, that's _totally_ what they're going to use it for! Thank you for this non-retarded post making a meaningful point!

OK, in my old age I find sarcasm more and more of a lame mechanism, but since your post is so silly I don't feel bad.

Re:Is this the Tock? (0)

Anonymous Coward | more than 3 years ago | (#34748242)

So you think the kill switch is a good thing then? You think that Intel did it to help the poor people that lose their computers? Nice of them. No wonder we all get so f**ked over so easily with suckers like that.

... and an Embedded DRM (1)

cyclocommuter (762131) | more than 3 years ago | (#34748416)

Apparently it also has embedded DRM, according to this article from The Inquirer [theinquirer.net]. The previous generation is looking better and better.

Re:Is this the Tock? (1)

robthebloke (1308483) | more than 3 years ago | (#34748738)

No. Read up on Intel AVX. The SIMD registers have been doubled in size from the previous generation. You can now do 8 floats per instruction instead of 4. :)

That's most likely the source of the performance increase... not the clock speed increase
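To put the 8-vs-4 claim in perspective, here is a rough back-of-the-envelope sketch (the core count and clock here are illustrative, not measured figures): doubling the SIMD width doubles the theoretical peak single-precision throughput, all else being equal.

```python
# Theoretical peak single-precision FLOPS from SIMD width alone.
# The core count and clock below are illustrative, not measurements.
def peak_flops(cores, clock_ghz, floats_per_insn, insns_per_cycle=1):
    return cores * clock_ghz * 1e9 * floats_per_insn * insns_per_cycle

sse = peak_flops(4, 3.4, 4)   # 128-bit SSE: 4 floats per instruction
avx = peak_flops(4, 3.4, 8)   # 256-bit AVX: 8 floats per instruction
print(avx / sse)  # -> 2.0
```

Real-world speedups are smaller, of course, since memory bandwidth and non-vectorized code dilute the gain.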

Re:Impressive graphics ? (-1)

Anonymous Coward | more than 3 years ago | (#34743168)

Because Macs matter, oh wait.

Re:Impressive graphics ? (1)

declain (1338567) | more than 3 years ago | (#34743338)

Nothing impressive in 3D, but excellent in video encoding/decoding which is much better than current AMD/nVidia cards.

Re:Impressive graphics ? (1)

robthebloke (1308483) | more than 3 years ago | (#34748768)

Apart from the fact that they are AVX-enabled (256-bit SIMD registers). Since the Intel graphics chips are partially implemented in software, there should be a bit of a performance boost in the TnL pipeline. An nVidia/ATI card would perform better though (and is easier to code for).

Re:Impressive graphics ? (0)

wisty (1335733) | more than 3 years ago | (#34743632)

I'm a bit of a Mac fanboy, and I do like the Core2Duo / nVidia combo, but C2D is now two generations behind. OK, IGP is roughly two generations behind nVidia, but it's harder to compare.

The only ray of hope is Intel's new mini SSD HDD - which might leave some room for discrete GPU.

Re:Impressive graphics ? (1)

Anonymous Coward | more than 3 years ago | (#34743678)

The 320M still ships in most notebooks with an nVidia GPU. It is still in active use.

The real deal is integrating a decent (320M or R5450 class) GPU with your notebook's CPU; then you don't NEED a 320M in your notebook. High-end gamers will still go Alienware, and your mom and dad will probably never use the full graphics power of their GPU. They might also find the HD video decoding capabilities of the GPU quite nice, and Intel has been quite fast in providing driver support.

This is a serious blow to nVidia. Why pay them more to have their GPU in your notebook when you can get a decent one integrated into your processor die? Most laptop users are not interested in real games, anyway.

Also, as a desktop GPU this is just about perfect for any use except real hardcore gaming. It runs an accelerated desktop and apps and decodes video to HDMI output. I couldn't ask for more in an HTPC.

This is evolution, not revolution, and since nVidia can't integrate their GPUs into CPU packages like AMD, they are going to have to come up with something smart real soon now.

Re:Impressive graphics ? (1)

RedK (112790) | more than 3 years ago | (#34744466)

In the case of Macs, the 320M is an integrated part. This Intel IGP is slower than an nVidia IGP from 2009. This is not evolution at all; it is regression.

Re:Impressive graphics ? (1)

beelsebob (529313) | more than 3 years ago | (#34744812)

I don't know what benchmarks you're reading, but anandtech's show the intel IGP beating a discrete 320m in 5 out of 6 tests. As far as I'm concerned the evolution here is:

2009: Core2Duo + GeForce 320M integrated – 50W total
2011: Sandy Bridge quad core – 45W total

So, we've dropped in power consumption slightly, added two extra cores, got a (slightly) faster graphics card, the cores on the CPU are each much faster than the cores on the original, and last but not least we've gained hardware video encoding support.

Yes it's evolution, but it's a damn good evolution.

Aside: in 2009 if we wanted a GPU capable of actually playing games we needed not just the Core2Duo and the nVidia chipset, but also a discrete graphics card; now we only need the i7 and the discrete graphics card, that's *still* a chip fewer for better performance at lower power consumption, even for higher end things.

Re:Impressive graphics ? (1)

RedK (112790) | more than 3 years ago | (#34745174)

I don't know what benchmarks you're reading, but the 320M in the MacBook Pro 13" he used is not a discrete/dedicated card; it is integrated into the system chipset, so it very much is an IGP.

Re:Impressive graphics ? (1)

beelsebob (529313) | more than 3 years ago | (#34747364)

anandtech didn't test with the MacBook Pro – they tested with a Core i5 machine with a discrete GT 320M; the Intel IGP won 5 of 6 tests.

Re:Impressive graphics ? (1)

RedK (112790) | more than 3 years ago | (#34749734)

Uh? Anandtech did test the MacBook with a 320M. Check out this image from the review: http://images.anandtech.com/graphs/graph4084/34978.png. In fact, if you look closely at that graph, there is no mention of the GT 320M on it, only the Apple MBP13 (P8600 + 320M). In a low-settings scenario, SB's IGP barely beats out the nVidia 320M, but that is probably more a testament to the Core i7 chip vs the C2D than to actual IGP performance. At medium settings, the tables are turned and the 320M beats the SB IGP, again meaning Intel failed on the graphics side. All this time you argued with me, you weren't even talking about the same thing? Wow.

Re:Impressive graphics ? (3, Insightful)

timeOday (582209) | more than 3 years ago | (#34743692)

I disagree; if integrated graphics are now trailing discrete by only 12-16 months, then NVidia has a problem. Not many games require a graphics card less than a year old, and not many people bother to buy one that often. And the integrated solution will be overall cheaper, smaller, and more power efficient.

Re:Impressive graphics ? (5, Informative)

RedK (112790) | more than 3 years ago | (#34743724)

The 320M is not a discrete graphics option, it's an integrated graphics option, same as this SB GPU. So you disagree out of ignorance more than actual disagreement. This is again a really poor showing on Intel's part.

Re:Impressive graphics ? (1)

timeOday (582209) | more than 3 years ago | (#34744182)

OK, I see there are different models [notebookcheck.net] : "Beware: The GT 320M should not be confused with the newer GeForce 320M in the Apple MacBook 13" 04/2010 laptops, which is a chipset graphics card." And from the context, you clearly were referring to the Apple version.

Re:Impressive graphics ? (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34745082)

Not sure what's Informative about your post, as it's incorrect. The 320M is not a CPU integrated GPU, which is what the SB GPU is. It's an off-chip option. Just because it's in a little chip doesn't make it dissimilar to a discrete card - it just has a lower power envelope and different IO configuration to memory and the CPU bus.

Guess what happens if you take the 320M and put it on a little PCI-Express board with some different traces? Yep: discrete graphics!

The 320M is a discrete graphics solution, and as such would be expected to outperform an on-board graphics solution that has to live with the power constraints of being onboard the CPU, though it also benefits from being on one single chip.

In short, I'm not sure you know what you're talking about as you're comparing apples and oranges and pretending they are both grapes.

Re:Impressive graphics ? (4, Informative)

RedK (112790) | more than 3 years ago | (#34745668)

The 320M used in Macs shares memory with main system memory. That used to be the definition of an integrated graphics part. Dedicated/discrete GPUs have their own memory, hence the dedicated/discrete part of the name. I've been following graphics cards/benchmarks/terminology since the mid-90s and 3Dfx's rise to fame.

The 320M I'm talking about and that Anand used is integrated in the chipset, same as all the Intel graphics before it, so it shares its die with a memory controller, a SATA controller, a PCI interface and a USB controller. It is the very definition of an integrated graphics part. Intel only decided to move the part from the chipset and integrate more on the CPU die itself. That doesn't make their showing any more impressive.

Re:Impressive graphics ? (2)

Shivetya (243324) | more than 3 years ago | (#34743790)

Well, from the article my impression is that they are trying to work more with DirectX than any other solution. Since the Mac does not support Microsoft DirectX graphics, I assume we will still see nVidia discrete graphics for some time to come.

Re:Impressive graphics ? (1)

robthebloke (1308483) | more than 3 years ago | (#34748670)

Well, not quite. As I understand it, the vertex (and I'm assuming geometry + compute + tessellation) shaders are all implemented in software on the CPU. Whereas the last generation of i7s was SSE4-enabled (4 floats per instruction), the new models come with AVX (8 floats per instruction). One would therefore assume there should be a significant improvement in 3D performance as a result... (assuming memory bandwidth doesn't cause too many problems).

Additional Story Resources (5, Informative)

Anonymous Coward | more than 3 years ago | (#34743070)

This article is too significant to post only one source for the information. Here are the other top sites:

HotHardware Mobile: http://hothardware.com/Reviews/Intel-Core-i72820QM-Mobile-Sandy-Bridge-Processor-Review/ [hothardware.com]
HH Desktop: http://hothardware.com/Reviews/Intel-Core-i72600K-and-i52500K-Processors-Debut/ [hothardware.com]

Anandtech: http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested [anandtech.com]
Tech Report: http://techreport.com/articles.x/20188 [techreport.com]

Legit Reviews: http://legitreviews.com/article/1506/1/ [legitreviews.com] (mobile)
Legit: http://legitreviews.com/article/1501/1/ [legitreviews.com] (desktop)

Re:Additional Story Resources (4, Informative)

Vigile (99919) | more than 3 years ago | (#34743612)

Agreed! The more people read about these products, the better informed they'll be. A couple more:

bit-tech: http://bit-tech.net/hardware/2011/01/03/intel-sandy-bridge-review/1 [bit-tech.net]

Neoseeker: http://neoseeker.com/Articles/Hardware/Reviews/Intel_i7_2600K_Intel_i5_2500K [neoseeker.com]

Impressive? (0, Informative)

Anonymous Coward | more than 3 years ago | (#34743108)

They reviewed the best intel can offer, and it's significantly weaker than the $30, several generations old, bargain basement 3D cards they compared it to. It's a step up from the previous generation of intel graphics, but it's still very weak indeed - and I just know it'll be in everything because it's dirt cheap and laptop vendors manage to make it sound good (enough). Not impressed.

Re:Impressive? (1)

alen (225700) | more than 3 years ago | (#34743328)

nobody cares since the people buying these won't care about playing Black Ops at the highest settings. i sat out the Steam sale this year because almost every game is available for the X-Box 360 and PS3. I just want a laptop to play Civ 4 once in a while and store all my photos and music. and Sandy Bridge seems to spank Apple's laptops in a lot of areas now

Re:Impressive? (1)

gnasher719 (869701) | more than 3 years ago | (#34743628)

nobody cares since the people buying these won't care about playing Black Ops at the highest settings. i sat out the Steam sale this year because almost every game is available for the X-Box 360 and PS3. I just want a laptop to play Civ 4 once in a while and store all my photos and music. and Sandy Bridge seems to spank Apple's laptops in a lot of areas now

The problem is that the integrated graphics just barely keep up with the MacBook graphics at the lowest resolution because of the advanced CPU, but once the resolution is increased, Sandy Bridge integrated graphics gets slaughtered by the lowly nVidia 320M in the MacBook.

Re:Impressive? (0)

Anonymous Coward | more than 3 years ago | (#34743776)

Well, not everyone has a PS3 or an Xbox360. ;)
I share a flat with three other people I only barely know, so I spend most of my time there in my (rather cramped) room. I've got a PC there; buying a PS3 and connecting it to one of the monitors would feel wasted.

Anyway, more to the point: Even being able to play new games on low settings is going to be a stretch on these by the time they're widespread. Note how Civ V got abysmal framerates even on the most powerful chip, but was quite playable on a $70 card from a few generations ago. (Admittedly, the Civ V engine is quite bad.)

but apple is pushing open CL and weaker video does (1)

Joe The Dragon (967727) | more than 3 years ago | (#34750072)

But Apple is pushing OpenCL, and weaker video does not help.

Re:Impressive? (1)

UnknowingFool (672806) | more than 3 years ago | (#34743658)

It's good enough for most general purpose laptops. However, even if you want discrete graphics in your new laptop, you are going to get Intel's offering no matter what. Just like the current generation of mobile Core i chipsets, the built-in graphics are merely disabled when you add your own separate graphics. I think Apple is the only manufacturer that has worked to get both working by switching between them depending on load.

Duplicate Links (2)

Aldenissin (976329) | more than 3 years ago | (#34743116)

CmdrTaco, you duped the links, which appears to be an accident.

More Reviews... (5, Informative)

I.M.O.G. (811163) | more than 3 years ago | (#34743132)

Summary is a bit light on sources... pcper.com is good, but you should be looking at multiple reviews to get a well rounded perspective.

Here's a few:
http://www.overclockers.com/intel-i7-2600k-sandy-bridge-review [overclockers.com]
http://legitreviews.com/article/1501/1/ [legitreviews.com]
http://www.tweaktown.com/reviews/3754/intel_core_i7_2600k_and_core_i5_2500k_sandy_bridge_cpus/index.html [tweaktown.com]
http://www.hitechlegion.com/reviews/processors/7689-intel-core-i5-2500k-processor-review [hitechlegion.com]

As expected (1)

lennier1 (264730) | more than 3 years ago | (#34743158)

Personally I'm more looking forward to the octo-core units which are scheduled for Q3 2011.
Combined with a decent dedicated GFX card they'll make a good basis for a new 3D workstation.

Goodbye LGA 1366 and 1156 (4, Interesting)

iamhassi (659463) | more than 3 years ago | (#34743160)

So the [overclock.net] rumors [slickdeals.net] are [overclockers.com.au] true: [answerdigger.com] according to the article all Sandy Bridge CPUs are Socket LGA 1155, [pcper.com] replacing the 18 month old LGA 1366 [slashdot.org] and 17 month old LGA 1156. [slashdot.org]

I'm all for bigger and better but it's a pain to throw away a $500 motherboard [newegg.com] every 18 months because Intel decided they want to change the socket.

On the other hand the latest 6-core processors from AMD [techguy.org] still support 3+ yr old AM2+ [legitreviews.com] motherboards. It's nice to see someone still looking out for the budget shopper.

Re:Goodbye LGA 1366 and 1156 (2)

networkBoy (774728) | more than 3 years ago | (#34743204)

I would attribute this mostly to growing pains related to moving the memory controller into the CPU directly. IIRC AMD had some socket thrash when they did this (though one certainly can wish Intel learned from AMD and only changed the socket once, not twice.)
-nB

Re:Goodbye LGA 1366 and 1156 (1)

aliquis (678370) | more than 3 years ago | (#34743378)

I don't get why 1366 gets no love, and why things like this aren't on 1366.

I would have paid extra for triple-channel over dual-channel, but it sucks when the enthusiast gear lags behind the normal stuff :/

Sure, they use higher clocks on their dual-channel sticks, but why no chipset and processors with higher-clocked triple channel?

The triple-channel CPUs perform better in SC2 even though it only uses two of the cores. Enough said =P

Re:Goodbye LGA 1366 and 1156 (1)

Anonymous Coward | more than 3 years ago | (#34744602)

FYI, triple channel is coming out in the third quarter.
The real replacement for LGA 1366 is LGA 1356 and the new LGA 2011.

Re:Goodbye LGA 1366 and 1156 (1)

aliquis (678370) | more than 3 years ago | (#34749280)

That sucks, so no upgrades of 1366 processors until then?

Will 1356 be the new mainstream and 2011 the new enthusiast/workstation/... or what? Guess I can check myself.

I don't know whether 1366 is worth it or not. It seems like it, but it still feels older :)

Re:Goodbye LGA 1366 and 1156 (1)

mdm-adph (1030332) | more than 3 years ago | (#34743636)

Might have a little bit of truth to it, but I thought this was just par for the course -- AMD did the same socket-switching BS back in the 754-939-AM2 days, when they had the fastest chips.

Re:Goodbye LGA 1366 and 1156 (0)

Anonymous Coward | more than 3 years ago | (#34743306)

Your comment makes no sense. Are you really trying to say that you're willing to pay $500 for a top-of-the-line mobo just so you can cheapskate 18 months later on a CPU that no one is forcing you to buy? Beyond that, do you really think that such a niche market as people upgrading just the CPU would factor into _anyone's_ plan? (Hint: you're not special and you're not a "budget shopper")

Besides, this is only the tock. Wait for each tick to upgrade...

Re:Goodbye LGA 1366 and 1156 (1)

Martin Blank (154261) | more than 3 years ago | (#34746008)

Sandy Bridge is the tick, not the tock, since it is a new architecture. Nehalem was the last tick, Westmere was last tock. Ivy Bridge is the next tock. Haswell is reported to be the next tick, then Rockwell as the following tock.

Re:Goodbye LGA 1366 and 1156 (0)

Anonymous Coward | more than 3 years ago | (#34746472)

Other way around, ticks are shrinks, tocks are new architectures.

http://www.intel.com/technology/tick-tock/index.htm

Re:Goodbye LGA 1366 and 1156 (4, Interesting)

Pojut (1027544) | more than 3 years ago | (#34743318)

This is pretty much the only reason I still stick with AMD. My upgrade cycles are every 2-4 years, so you'd think it would make more sense for me to go Intel since their stuff is "better". Nope! I've kept the same motherboard for the past two cycles, and even though I'm getting a better CPU (going from Athlon II X4 to Phenom II X6) and better video card (going from ATI 4850 to nVidia 570), I'm STILL going to keep the same motherboard and RAM. The Phenom II will be the third CPU I've dropped into this motherboard (Athlon X2 -> Athlon II X4 -> Phenom II X6).

Re:Goodbye LGA 1366 and 1156 (1)

Anonymous Coward | more than 3 years ago | (#34743324)

Did AMD pay you to post this?

First,if you are shopping on a budget, you aren't going to buy $500 dual-socket Xeon server board, you will buy an $80 budget one.

Second, if you are using a Socket 1366 or 1156 processor (newer than 18 months old), you probably aren't looking to upgrade, and if you are going to pitch your 18 month old hardware, you aren't going to be especially budget constrained.

Re:Goodbye LGA 1366 and 1156 (1)

Martin Blank (154261) | more than 3 years ago | (#34745962)

I bought an i7-920 when they first came out, and I'm happy with its performance as it stands. I may look to upgrade with Ivy Bridge next year, but will probably hold off until Haswell in 2013.

Re:Goodbye LGA 1366 and 1156 (0, Insightful)

Anonymous Coward | more than 3 years ago | (#34743620)

It's nice to see someone still looking out for the budget shopper.

Ah yes, the "budget shopper" that whines about having to replace a $500 mobo but not apparently about the $2000+ worth of Xeons (not to mention memory, etc) that go into it...

Seriously - unlike your mom, AMD won't make you breakfast no matter *how* much time you spend licking their asshole.

Re:Goodbye LGA 1366 and 1156 (1)

DigiShaman (671371) | more than 3 years ago | (#34743886)

Depends on how long you want to keep your machine. Often it pays to purchase a motherboard with quality capacitors, a rock-solid design, and good support. That is to say, upgradeability is not the only consideration to keep in mind.

Re:Goodbye LGA 1366 and 1156 (4, Insightful)

Kjella (173770) | more than 3 years ago | (#34743954)

Hint: dropping $300 on every processor generation Intel makes is a waste of money. If you've got that much to spend, buy a more expensive CPU and keep it a generation or two longer. It's not like it goes broke just because it's not the newest toy anymore, you know.

So, in order, here's why this is mostly irrelevant to the market:
1) The majority is laptops now (since 2008) and nobody upgrades the CPU there
2) Most people will get their desktop from an OEM and never upgrade
3) If you assume a new Intel will require a new mobo, you buy accordingly

Ok, so maybe you made a smart upgrade investment. Hurray, you belong to 1% of the market. Intel is still laughing all the way to the bank...

Re:Goodbye LGA 1366 and 1156 (0)

Anonymous Coward | more than 3 years ago | (#34744380)

Here's what I don't get about this kind of complaint: if you have a LGA1366 motherboard, that means you already have a kickass processor. Maybe it's not the latest and greatest anymore, but even so, it's still massively overwhelmingly awesome. How are you in the market for a new CPU for that particular box?

By the time your current processor starts to feel old, you're not going to want to use that motherboard anyway, since it's limited to 1333 MHz DDR3 RAM instead of 4000 MHz DDR5, has sockets for only two processors instead of the 8 sockets that real power users use, talks to a SATA-5 drive at a miserably crippled 3 GB/s, doesn't have any PCIe x64 lanes, the built-in ethernet can't talk at the 100 Gbps standard, and so on. My point is, no matter how cool your motherboard is, by the time your processor is obsolete (probably around 2017), the motherboard is going to be obsolete too, and not just because of the CPU socket, but everything else as well.

Enjoy your dual LGA 1366 CPUs. Trust me, the grass isn't that much greener on the other side. Your processors are just fucking fine, so be happy with them and don't worry about what socket the newer 3%-better stuff uses. IT DOESN'T MATTER.

Re:Goodbye LGA 1366 and 1156 (1)

0100010001010011 (652467) | more than 3 years ago | (#34744752)

I just upgraded my 3 year old "AM2+" with a X4 640 AM3 (it needed a BIOS update).

Only problem is the north bridge heatsink wasn't designed for that much IO so I'm going to have to install a fan (it's literally too hot to touch).

That plus the fact that they dominate the PassMark's CPU/$ [cpubenchmark.net] benchmark means I'll be buying AMD for a long time...

Except if it's their GPU offerings. They need to pull their heads out of their asses and catch up with Linux Support to Nvidia.

Re:Goodbye LGA 1366 and 1156 (1)

Espectr0 (577637) | more than 3 years ago | (#34745004)

Sandy Bridge has been on sale in Australia for about two months.

I just got: i5 processor, Gigabyte motherboard, 8 GB DDR3-1333, ATI Radeon 5770, all for $740.

Re:Goodbye LGA 1366 and 1156 (1)

ender06 (913978) | more than 3 years ago | (#34745052)

That's got to be the first time Australia got something first or at least before the States. Lucky bastard :)

Re:Goodbye LGA 1366 and 1156 (0)

Anonymous Coward | more than 3 years ago | (#34749114)

Sandy Bridge has been on sale in Australia for about two months.

I just got: i5 processor, gigabyte motherboard, 8 gigs ddr3 1333, ati radeon 5770, all for 740$

Where from? My local store in Gold Coast has a bunch of Gigabyte motherboards with the 1155 socket, but no 1155 CPUs as yet.

Re:Goodbye LGA 1366 and 1156 (1)

crunchygranola (1954152) | more than 3 years ago | (#34746912)

I'm all for bigger and better but it's a pain to throw away a $500 motherboard every 18 months because Intel decided they want to change the socket.

I hear ya, but on the other hand your new 1155 mobo is likely to have 6 Gbps SATA and USB 3.0, which your existing 1366 mobo most likely does not have. Changing out your motherboard won't just get you a new socket.

Re:Goodbye LGA 1366 and 1156 (1)

Just Another Poster (894286) | more than 3 years ago | (#34749474)

I read on Wikipedia that the Sandy Bridge replacements for LGA 1366 high-end desktop CPUs aren't due out until Q3 2011. Maybe that's when X68 will be released.

DUNNO BOUT YOU BUT I AINT USING NO SANDY BRIDGE !! (0)

Anonymous Coward | more than 3 years ago | (#34743264)

Bridges are bad enough lately, and I ain't using NO SANDY BRIDGE DRM CRAP and that's a fact Jack !!

FIFY (0)

Anonymous Coward | more than 3 years ago | (#34743298)

a new MARKETING Boost mode that dumbs down your processor for most of your session

There, fixed that slashvertisement for you.

Re:FIFY (1)

aliquis (678370) | more than 3 years ago | (#34743422)

That would have been more of an issue if the competition had better chips.

They don't.

So regardless if you feel Intel is holding their chips back or not what are you going to do about it?

Re:FIFY (1)

Tr3vin (1220548) | more than 3 years ago | (#34743732)

I prefer my CPU to be running in a low power mode most of my session. TurboBoost also serves to keep power consumption and thermals within a good range while allowing single threaded tasks to run a bit faster than they would normally.

Re:FIFY (1)

TheRaven64 (641858) | more than 3 years ago | (#34744152)

No, Turbo Boost does not dumb down your processor. It turns an SMP system into an asymmetric multiprocessor system on demand. If you are running a single CPU-bound thread (a pretty common workload), then it overclocks one core and underclocks the others so that you get better single-thread performance but don't overstep the CPU's thermal dissipation limit.

Ideally, you'd never use Turbo Boost, because all of your applications would be written to use n threads (where n is variable depending on the number of cores available) and to perfectly evenly distribute work among them. Back in the real world, it provides a performance boost because CPU-bound tasks tend not to be perfectly evenly distributed among concurrent threads or processes.
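That last point can be illustrated with a toy model (the clock speeds and work units below are made up for illustration; they are not Sandy Bridge's real turbo bins): with a badly balanced workload, completion time is set by the slowest core, so boosting the loaded core at the expense of the mostly idle ones finishes sooner than running everything at the base clock.

```python
# Toy model: each core gets some work units; completion time is governed
# by the slowest core. All numbers below are invented for illustration.
def finish_time(work_per_core, clocks_ghz):
    return max(w / c for w, c in zip(work_per_core, clocks_ghz))

work = [80, 10, 5, 5]            # CPU-bound work, badly balanced across 4 cores
base = [3.4] * 4                 # every core at the base clock (GHz)
turbo = [3.8, 3.0, 3.0, 3.0]     # boost the busy core, underclock the rest

print(finish_time(work, base) > finish_time(work, turbo))  # prints: True
```

The asymmetric schedule wins exactly because the work isn't evenly distributed, which is the common case the parent describes.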

Re:FIFY (0)

Anonymous Coward | more than 3 years ago | (#34746120)

That isn't what Intel claim it does and it isn't what my system does. Modern systems are asymmetric by design to increase performance (NUMA, per processor caches, per processor branch predictors).
Just accept that we haven't had deterministic computers for a long while and clock boosting techniques are unavoidable to get most performance/(price+power).

-- Megol

Re:FIFY (1)

MechaStreisand (585905) | more than 3 years ago | (#34747360)

Ideally, there's no such thing as tasks which can't scale to multiple threads?

Re:FIFY (0)

Anonymous Coward | more than 3 years ago | (#34749330)

Where is this 'Ideally' you speak of, I'd like to move there as the rent is too damn high here in Reality.

Re:FIFY (0)

Anonymous Coward | more than 3 years ago | (#34749836)

No, Turbo Boost does not dumb down your processor. It turns an SMP system into an asymmetric multiprocessor system on demand. If you are running a single CPU-bound thread (a pretty common workload), then it overclocks one core and underclocks the others so that you get better single-thread performance but don't overstep the CPU's thermal dissipation limit.

That's a somewhat misleading description of what goes on. Turbo Boost "overclocks" (not a good word to describe it since Intel actually characterizes all possible Turbo speeds and guarantees correct operation) all active cores equally.

The key word there is "active". Turbo Boost capable CPUs also have a second key feature: whole-core power gating. Intel integrates a large ultra low loss power FET on die per core. When a core is idle, this allows the power controller to completely shut it down so that literally no power flows to it (in modern logic processes, merely stopping the clock would still leave the core burning a lot of power through leakage). When cores are shut down, that gives the Turbo Boost controller more thermal headroom to boost the remaining cores.

Because TB works within thermal headroom, and because some types of code consume more power than others, it's possible for TB to boost your CPU even when all cores are active, as long as you're running "cool" code. The amount of boost is limited compared to what's allowed when one or more cores are shut down.
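A toy model of that headroom trade: every parked core frees thermal budget that the remaining active cores can spend on extra speed bins. The bin size and bin counts here are made-up illustrative defaults, not Intel's characterized Turbo tables (though they happen to line up with the i7-2600K's 3.4 GHz base / 3.8 GHz max turbo):

```python
def turbo_frequency(base_ghz, active_cores, total_cores,
                    bin_ghz=0.1, max_extra_bins=4):
    """Toy model of Turbo Boost: every power-gated (parked) core frees
    thermal headroom, letting the active cores climb extra speed bins
    above the base clock; even all-cores-active "cool" code gets one bin."""
    parked = total_cores - active_cores
    extra_bins = min(parked + 1, max_extra_bins)
    return base_ghz + extra_bins * bin_ghz

# Hypothetical 3.4 GHz quad-core:
one_core_active  = turbo_frequency(3.4, 1, 4)   # ~3.8 GHz
all_cores_active = turbo_frequency(3.4, 4, 4)   # ~3.5 GHz
```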

intel also needs more PCI-E lanes as just X16 for (1)

Joe The Dragon (967727) | more than 3 years ago | (#34743404)

Intel also needs more PCI-E lanes; just x16 for video is not OK with Light Peak, USB 3, CableCARD tuners, and more on the way that need more than just a PCI-E x1 slot. It should have 20 lanes so you can have x16 for video and x4 for an add-in card.

Re:intel also needs more PCI-E lanes as just X16 f (1)

kevmatic (1133523) | more than 3 years ago | (#34745246)

That's 16 lanes coming directly out of the CPU; the chipset also provides an additional 8 lanes.

This means that getting data from a discrete GPU to a PCIe Light Peak card will require a journey from the GPU, through the CPU's PCIe lanes, through the CPU, down whatever they're calling the frontside bus this week, into the chipset's PCIe controller, down those lanes, and into the Light Peak card. I don't know whether that will affect performance much.

Of course, I doubt we'll see GPU support for Light Peak monitor connections OR Light Peak monitors for at least a year after Light Peak itself comes out, so it's unlikely to see use this CPU generation.
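Back-of-the-envelope numbers for why that journey could matter, using raw PCIe 2.0 link rates only (DMI modeled as roughly a x4-equivalent link; protocol overhead ignored):

```python
def pcie2_bandwidth_mb_s(lanes):
    """PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 500 MB/s usable
    per lane, per direction (protocol overhead ignored)."""
    return lanes * 500

gpu_link = pcie2_bandwidth_mb_s(16)   # x16 GPU link: 8000 MB/s
dmi_link = pcie2_bandwidth_mb_s(4)    # DMI ~= a PCIe 2.0 x4 link: 2000 MB/s
print(gpu_link, dmi_link)
```

So anything routed GPU-to-chipset squeezes through a link with roughly a quarter of the GPU's own bandwidth, shared with everything else hanging off the chipset.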

or use a Voodoo 2 like Loopback Cable (1)

Joe The Dragon (967727) | more than 3 years ago | (#34749132)

or use a Voodoo 2-style loopback cable to get the video onto the Light Peak bus.

Laptops with SandyBridge's launch? (1)

DoofusOfDeath (636671) | more than 3 years ago | (#34743508)

Anyone know whether or not Dell's M6xxx mobile workstation line will start offering Sandy Bridge processors on/around the official Sandy Bridge launch date?

I've read that Dell will roll out a new product (M6600) with Sandy Bridge, but I don't know if it's happening this week, or something later.

Re:Laptops with SandyBridge's launch? (0)

Anonymous Coward | more than 3 years ago | (#34744030)

I would e-mail dell customer service about it.

Re:Laptops with SandyBridge's launch? (1)

KZigurs (638781) | more than 3 years ago | (#34750210)

And they will reply to you describing their great value deals of the week.

On die DRM (0)

Anonymous Coward | more than 3 years ago | (#34743690)

Nobody will comment about the deal with Intel and MAFIAA to put DRM in Sandy Bridge?

Slow day.

Re:On die DRM (2)

RightSaidFred99 (874576) | more than 3 years ago | (#34748478)

Nobody with any sense gives a shit - if you don't like the DRM features, don't use programs that require them.

Re:On die DRM (1)

KZigurs (638781) | more than 3 years ago | (#34750216)

and who cares? Remember PIII?

Still no decent low cost computer (1, Interesting)

InsaneProcessor (869563) | more than 3 years ago | (#34743716)

And we still have no decent quality low cost mobile computer. Don't try to tell me the Atom is decent or low cost!

Price vs Performance (4, Insightful)

cryptoluddite (658517) | more than 3 years ago | (#34743806)

Also interestingly, the most expensive desktop part will start at $317, putting the screws to AMD yet again.

When has Intel ever lowered prices without needing to?

It's more likely that instead of putting the screws to AMD, Intel is worried about Bobcat and Bulldozer coming out pretty soon and is factoring that into its prices (to gain market share before the AMD chips get out). On merit, Bobcat CPUs should dominate the low-end laptop/netbook market with low power use and real integrated graphics. Bulldozer should do well in the high-end server market, again with low power and more cores... basically, where Intel CPUs have Hyper-Threading, Bulldozer has another actual core (for integer instructions).

Re:Price vs Performance (2)

Kjella (173770) | more than 3 years ago | (#34744784)

When has Intel ever lowered prices without needing to?

Intel will act rather aggressively to deny AMD any high-margin CPUs if it can; it knows keeping AMD cash-starved is the best guarantee of its continued dominance, and Intel has plenty of other markets to get its margins. Intel is way ahead of the game at this point. It knows AMD has launches coming soon, so first it empties the market of people shopping for a new CPU, and second it makes sure that instead of glowing reviews reporting that AMD forced Intel into price cuts, AMD gets "too little, too late" reviews. This is the first volley; I expect another after the AMD launches, with new $500-1000 processors just to rub in their performance dominance and get another round of positive reviews. Intel has AMD backed way into the "value" corner, and they're not getting out that easily...

Re:Price vs Performance (1)

should_be_linear (779431) | more than 3 years ago | (#34744838)

No, it is mobile computing that worries Intel. ARM-based devices are set to take more than 10% of all US Internet browsing in 2011, with no signs of slowing down anytime soon. Intel is in the same situation as Microsoft: cutting prices to keep market share. AMD missed the opportunity in mobile computing; they could have used their CPU and GPU expertise to create chips for killer phones/pads. Instead, they wasted all their time on same-day compatibility with yet another DirectX version. Clearly, the winners are consumers buying a new computer/pad/phone this year. With so many players involved, prices will collapse.

Re:Price vs Performance (1)

AvitarX (172628) | more than 3 years ago | (#34747280)

And AMD has a very compelling mobile platform coming.

A 9-watt part, better than Atom by far, with an incremental DX11 update to the embedded graphics. And an 18-watt part competitive with current-generation mid-range desktops.

I know I would love an Ontario pad, netbook, or even laptop. Still more power than I'd be willing to use in a phone.

Re:Price vs Performance (1)

Rudeboy777 (214749) | more than 3 years ago | (#34750408)

AMD missed the opportunity in mobile computing; they could have used their CPU and GPU expertise to create chips for killer phones/pads.

They only missed the opportunity in the sense that the Brazos platform is not quite out yet. This combination will kill pads and the derivatives may kill phones.

Beware if you want to install Linux! (0, Troll)

IYagami (136831) | more than 3 years ago | (#34743856)

According to semiaccurate:

" If you try to use Sandy Bridge under Linux, it is simply broken. We tried to test an Intel DH67BL (Bearup Lake) with 2GB of Kingston HyperX DDR3, an Intel 32GB SLC SSD, and a ThermalTake Toughpower 550W PSU. At first we tried to install vanilla Ubuntu 10.10/AMD64 from a Kingston Datatraveler Ultimate 32GB USB3 stick. The idea was that it would speed things up significantly on install.
That's when the crippling bug surfaced. It seems the USB3 ports on the Intel DH67BL don't want to work. Ubuntu 10.10 installs fail during the install, no fix was found. Plug the same stick into a USB2 port, and it works fine. Alternately, install from a USB2 stick on a USB3 port, and things work fine."

source: http://semiaccurate.com/2011/01/02/sandy-bridge-biggest-disapointment-year/ [semiaccurate.com]

Re:Beware if you want to install Linux! (1)

DurendalMac (736637) | more than 3 years ago | (#34744084)

So Ubuntu puts out a bugfix and everything works dandy again. Graphics drivers improve under Linux in a month or so. What a shocker.

It's not like Charlie hasn't shown himself to be a complete and total ass before, as Intel pointed out at the end of the article. Demerjian is a whiny tosser who has been spouting bullshit for quite some time now.

Re:Beware if you want to install Linux! (3, Informative)

CajunArson (465943) | more than 3 years ago | (#34744934)

Uh... so you install a several-months-old version of Linux on a brand new architecture and it doesn't work, therefore the architecture is "broken"?

There are fully 100% open source drivers available for Sandy Bridge RIGHT NOW. Phoronix [phoronix.com] (usually a purveyor of sensationalism, but a voice of reason in this case) goes out of its way to detail exactly what you need to run Sandy Bridge with 100% open source code. Now... is it all 100% released yet? No, but at the same time, remember that SB isn't even officially for sale yet. It WILL be fully supported in the next round of distro updates, and you can get all the pieces to run it right now if you are truly as l33t as you think you are.

I'm just sitting back and waiting for the AMD fanboys to scream about how AMD is so wonderful and all AMD graphics work perfectly in Linux, when someone gets glxgears running on a 6000-series part in 6 months......

Re:Beware if you want to install Linux! (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34745252)

Semiaccurate.. hmm. Seems a fitting name for such idiocy.

Sandy Bridge does not have USB3. That motherboard may have a USB3 chip onboard, but it has nothing to do with Sandy Bridge, other than the lame fact that the SB chipsets don't have USB3 yet.

Re:Beware if you want to install Linux! (1)

KZigurs (638781) | more than 3 years ago | (#34750236)

You mean just-announced, just-released hardware risks breaking compatibility with an 18+ month old platform? Shocking!


Euler3D -- compiler? (1)

coats (1068) | more than 3 years ago | (#34744922)

There's one benchmark here that could reasonably be compiled on a processor-specific basis, to show what the processor really can do (as opposed to all the other benchmarks, which are based on proprietary least-common-denominator executables): Euler3D.

And there are processor-specific enhancements that could have great influence (150%?) on this code's performance... As it happens, this benchmark is the one of greatest professional interest to me, anyway :-)

I'd really like to know how its performance would compare with Gulftown, if the benchmark were compiled for Sandy Bridge, with the latest (Sandy Bridge-supporting) edition of ifort.

Treacherous Computing (2)

Skjellifetti (561341) | more than 3 years ago | (#34745998)

Sandy Bridge [bloomberg.com] implements treacherous computing [gnu.org].

Re:Treacherous Computing (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34748518)

Don't use the DRM features or buy media that requires it. Seems kind of a silly thing to whine about, doesn't it?

Wow "Turbo Boost Mode" (1)

shockbeton (669384) | more than 3 years ago | (#34747692)

How do I turn it off? I'm blinded by the SPEED! Is Intel stealing 15 year old marketing shtick from Gateway?

Coming soon.... (0)

Anonymous Coward | more than 3 years ago | (#34749432)

Sandy Bridge is expected to be implemented in the Sandy Vagina wearable platform.

Turbo boost (1)

yelvington (8169) | more than 3 years ago | (#34750474)

" a new Turbo Boost mode that increases clock speeds dynamically "

Dynamically? I want my Turbo Boost button [codinghorror.com] back. 66 megahertz or bust!
