Slashdot: News for Nerds

Next Gen Intel CPUs Move To Yet Another Socket

CmdrTaco posted more than 4 years ago | from the consistency-is-stupid dept.

Intel 254

mr_sifter writes "According to a leaked roadmap, next year we'll be saying hello to LGA1155. The socket is one pin different from the current LGA1156 socket that Core i3, i5, and some i7 CPUs use. Sandy Bridge CPUs will be based on the current 32nm, second-generation high-k metal gate manufacturing process. All LGA1155 CPUs will have integrated graphics built into the core instead of on a separate chip. This is an upgrade from the current IGP, PCI Express controller and memory controller in Clarkdale CPUs, which is manufactured on the older 45nm process as a separate die (but still slapped together in the same package). This should improve performance, as all the controllers will be in one die, like existing LGA1366 CPUs."


254 comments

Sigh (0)

Anonymous Coward | more than 4 years ago | (#31922584)

Intel loves to rape consumers of their money. I'm hoping HP's crossbar latch technology gets out soon.

Re:Sigh (4, Interesting)

MrNaz (730548) | more than 4 years ago | (#31922704)

There's always AMD's Fusion on the horizon. If they can execute well on that they have a chance to do what they did with the Athlon. Intel has yet to demonstrate that they actually have GPU tech that can compete with nVidia and ATI in this space. I really hope they do, Intel has had too long at the top of the market and they're getting all monopolistic again.

Re:Sigh (2, Interesting)

MrNaz (730548) | more than 4 years ago | (#31922732)

As in, I hope AMD can execute, not I hope Intel have tech that can compete with nVidia and ATI. The former would lead to better competition, the latter would give the monopolist more power.

That'll teach me to not preview.

Re:Sigh (3, Insightful)

MrNaz (730548) | more than 4 years ago | (#31922752)

Gah! I meant "that'll teach me to preview".
Someone pass me a mallet. My head seems to need a little percussive maintenance.

Re:Sigh (-1, Flamebait)

HarrySquatter (1698416) | more than 4 years ago | (#31923126)

I really hope they do, Intel has had too long at the top of the market and they're getting all monopolistic again.

How dare Intel make better products than AMD and thusly gain market share back! How evil of them!

Re:Sigh (1)

caerwyn (38056) | more than 4 years ago | (#31923342)

I don't think the GP is upset at *Intel* in this regard; I think it's more a perfectly realistic consumer complaint: "I wish there were more competition in this space because that would be better for me as the consumer." AMD dropped the ball pretty badly after a very strong run with earlier Athalons. It'd be great to see them get back into the game and really help push things along again.

Re:Sigh (1)

sznupi (719324) | more than 4 years ago | (#31923532)

AMD dropped the ball pretty badly after a very strong run with earlier Athlons. It'd be great to see them get back into the game

(corrected one product name...)

That's not so simple. How much of "AMD dropping the ball" was because of illegal, anticompetitive practices by Intel? Practices which, essentially, robbed AMD of the money needed for aggressive R&D and fab expansion.

Re:Sigh (2, Informative)

sznupi (719324) | more than 4 years ago | (#31923474)

Intel, through illegal practices, prevented AMD from benefiting fully from their lead with the K7 and early K8 Athlons. This illegally rerouted money weakened AMD's R&D and fabs, while strengthening Intel's at the same time.

Re:Sigh (2, Insightful)

MoldySpore (1280634) | more than 4 years ago | (#31923272)

I really hope that AMD gets back on top and can compete with Intel on the top-level CPUs again. I am tired of the Intel fanboys crapping all over AMD for the last few years, and really the industry NEEDS AMD to get back on top and help drive the price of these Intel chips down. The price gap is so huge between AMD and Intel that it makes building a top-of-the-line Intel machine very daunting for us working-class enthusiasts and system builders.

Thankfully AMD's new hexacores will work in AM3 sockets so a motherboard upgrade isn't necessary at least for the Phenom II X6's. To me that is a big deal. I think it will be for a lot of others as well.

Re:Sigh (1)

bberens (965711) | more than 4 years ago | (#31923824)

Maybe it's changed in the last couple of years, but my understanding was that AMD was still the place to go for database-type machines because of the bus speed, while Intel was the way to go for number-crunching app machines. At the time I did the research and tried to explain that to my manager, but it didn't matter: the Core 2 Duo was kicking the pants off AMD on the personal computer, so Intel was faster and that's what we were getting for all the machines.

GOOD GOLLY MISS MOLLY !! (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31922590)

Holy smokes, this is great news for all !!

This socket goes to 1155 (1)

jayhawk88 (160512) | more than 4 years ago | (#31922598)

Well, it's one louder...err faster, isn't it?

Re:This socket goes to 1155 (0)

Anonymous Coward | more than 4 years ago | (#31922694)

but this one goes to 1156?

Hopefully my eye socket (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31922600)

I want to be the terminator.

Integrated graphics in the CPU? (3, Interesting)

Lord Lode (1290856) | more than 4 years ago | (#31922624)

I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.

But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

Re:Integrated graphics in the CPU? (0)

Anonymous Coward | more than 4 years ago | (#31922682)

Who needs a proper graphics card these days?
Only people who need real-time high-def geometry rendering. They are free to choose a CPU without an on-die GPU.
Everyone else is happy they won't need additional cooling infrastructure on their mainboard.

Real-time high-def geometry rendering (1)

tepples (727027) | more than 4 years ago | (#31923032)

Who needs a proper graphics card these days?
Only people who need real-time high-def geometry rendering.

More people will need this than you might think. Let's look at each piece of your claim:

  • Real-time: Graphical user interfaces must respond instantly to the user's commands. Newer systems have added window entry/exit effects to clarify the relationship among various items on the screen.
  • High-def: PC GUIs have met the definition of high-definition video since the 1990s, when XGA resolution (1024x768px) became common.
  • Geometry: I'll admit that most office and web applications currently use much simpler geometry than a typical Xbox 360-class 3D game. But as IE 9 becomes popular, SVG and <canvas> will finally become viable, and even font rasterization will move to the GPU.

Re:Real-time high-def geometry rendering (0)

Anonymous Coward | more than 4 years ago | (#31923318)

Font rasterization has been GPU accelerated in various forms in Windows for at least a decade now. This is why certain effects are impossible with the native Windows font renderer.

Re:Real-time high-def geometry rendering (1)

T-Bone-T (1048702) | more than 4 years ago | (#31923756)

I'm using a desktop that I recently built with a Core i3-530 and the built-in graphics are quite acceptable, even at 1600x900 (the monitor was free so I won't complain about the odd resolution). The only place they suffer is in high-performance areas like games. The IGP is designed to process Full HD video.

Re:Real-time high-def geometry rendering (2, Insightful)

Firethorn (177587) | more than 4 years ago | (#31923828)

More people will need this than you might think. Let's look at each piece of your claim:

I think that the issue here is where you place the line on a 'proper' graphics card.

By that I mean that today even integrated video cards are easily able to keep up with GUIs, play even Blu-ray movies, etc...

I'm not sure SVG/canvas rasterization will really bog down modern integrated graphics engines. Or if it doesn't support it, it'll fall back to the CPU, and assuming you're not doing anything too CPU-intensive at that moment, it won't matter. You don't need a 5870 to run Office or IE.

Re:Integrated graphics in the CPU? (1)

ryantmer (1748734) | more than 4 years ago | (#31923370)

Who needs a proper graphics card these days? Only people who need real-time high-def geometry rendering. They are free to choose a CPU without an on-die GPU. Everyone else is happy they won't need additional cooling infrastructure on their mainboard.

Have you ever tried playing anything but Farmville on integrated graphics? Apparently the Clarkdale CPU/GPUs are only about 1.5x more powerful than existing integrated graphics at this point. While I realize the Sandy Bridge (who comes up with these names, anyway?) should improve on this, I'm guessing that even playing TF2 on one of these things would be rather unsatisfactory...

Re:Integrated graphics in the CPU? (0)

Anonymous Coward | more than 4 years ago | (#31922692)

The CPUs are already as big as they need to be. A lot of the space is devoted to useless cache for the sake of marketing.

Re:Integrated graphics in the CPU? (2, Interesting)

marcansoft (727665) | more than 4 years ago | (#31922838)

Um, no. Cache is very important, especially with 64-bit code. In fact, x86 is a terribly die-area-inefficient architecture; we'd be a lot better off with a modern RISC, opening up space for more cache.

Re:Integrated graphics in the CPU? (3, Informative)

msgmonkey (599753) | more than 4 years ago | (#31923082)

Your point would have been valid 10 years ago, but the die area used for the CISC instruction decoder on a modern x86 processor is negligible. In fact, the x86 instruction set is more compact than a pure RISC CPU's, so you can fit more instructions into the instruction cache (ARM processors have a Thumb mode with more compact 16-bit instructions because of this).

Re:Integrated graphics in the CPU? (5, Informative)

marcansoft (727665) | more than 4 years ago | (#31923356)

The key is modern RISC, not RISC. x86 is horribly inefficient. I'm not talking about the instruction decoder, I'm talking about the instruction semantics. x86 was never designed for today's high-performance CPUs, and the result is that the instruction set basically allows the programmer to do anything they want, even if it goes against modern CPU design optimizations. This forces the CPU to devote a large amount of die area to workaround logic that detects the thousands of possible dirty tricks that a programmer might use which are allowed by the ISA. For example, every modern RISC requires that the programmer issue cache flush instructions when modifying executable code. This is common sense. x86 doesn't, which means there needs to be a large blob of logic checking for whether the data you just touched happens to be inside your code cache too. The fact that on x86 you can e.g. use one instruction to modify the next instruction in the pipeline is just so ridiculously horribly wrong it's not even funny. There are similar screw-ups related to e.g. the page tables. I can't even begin to imagine the pains that x86 CPU engineers have to go through.

You can make an x86 chip reasonably small and very slow, or very large and very fast. x86 doesn't let you have it both ways to any reasonable degree.
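A minimal sketch of the self-modifying-code point above (my illustration, assuming GCC or Clang on Linux; the buffer holds x86-64 machine code for "mov eax, 42; ret"): the line to notice is the __builtin___clear_cache call, which is mandatory on ARM, PowerPC and most other RISCs before jumping into freshly written instructions, but compiles to nothing on x86, where the core keeps its instruction cache coherent with data writes on its own.

<ecode>
/* jit_sketch.c - write a tiny function into memory and call it.
 * On x86 the explicit cache flush below is a no-op; on a modern RISC
 * it is required - that is the extra coherency logic being discussed. */
#define _GNU_SOURCE
#include <stdint.h>
#include <string.h>
#include <sys/mman.h>

typedef int (*fn_t)(void);

int main(void) {
    /* x86-64 encoding of: mov eax, 42 ; ret */
    static const uint8_t code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    uint8_t *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                        MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED)
        return 1;

    memcpy(buf, code, sizeof code);   /* instructions written as plain data */

    /* Required on ARM/PowerPC/etc. before executing the buffer;
     * emits no code at all on x86. */
    __builtin___clear_cache((char *)buf, (char *)(buf + sizeof code));

    fn_t fn = (fn_t)buf;              /* jump into the freshly written code */
    return fn() == 42 ? 0 : 1;
}
</ecode>

(Systems that enforce W^X would need an mprotect step between writing and executing; omitted to keep the sketch short.)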

Re:Integrated graphics in the CPU? (1)

amorsen (7485) | more than 4 years ago | (#31923822)

The neat thing about the x86 architecture is that it has forced the chip designers to be really clever. E.g. the register limitations have forced them to find ways to make level 1 cache really fast; you'll be hard pressed to find non-x86 chips with faster level 1 cache. Similarly, the system call latency is fantastic. Most importantly, the (quite) strong memory ordering provided by x86 means that x86 is pretty much unmatched when it comes to inter-CPU communication. Look at the hoops e.g. PA-RISC goes through to handle SMP in Linux, then compare with x86, where some of the memory barriers are even no-ops and just get turned into compiler barriers. RISC designers believe that cache coherency is expensive and that programmers should be aware of that cost. x86 has proven that it doesn't have to be expensive -- except in chip real estate.

Anyway, if you can live with somewhat sub-par performance, Larrabee shows that it's still possible to get decent performance with a smaller chip. Not quite as small as ARM or MIPS though, so a lot of the embedded space is closed to x86.
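A minimal sketch of what that ordering difference looks like from the software side, using C11 atomics (my illustration, assuming a POSIX system; compile with -pthread): the release store and acquire load below compile to plain MOVs on x86 because of its strong memory model, while weakly ordered CPUs such as ARM, POWER or PA-RISC need real barrier or load-acquire/store-release instructions for the same source.

<ecode>
/* ordering_sketch.c - one-producer/one-consumer handoff with C11 atomics. */
#include <stdatomic.h>
#include <pthread.h>
#include <stdio.h>

static int payload;              /* ordinary data being handed off   */
static atomic_int ready = 0;     /* flag that publishes the payload  */

static void *producer(void *arg) {
    (void)arg;
    payload = 42;                                    /* plain store   */
    atomic_store_explicit(&ready, 1,
                          memory_order_release);     /* publish       */
    return NULL;
}

static void *consumer(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;                                            /* spin until published */
    printf("payload = %d\n", payload);               /* guaranteed to print 42 */
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
</ecode>

The source is identical on every architecture; only the barriers the compiler emits differ, which is the "real fence vs. compiler barrier" distinction above.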

Re:Integrated graphics in the CPU? (1)

Calinous (985536) | more than 4 years ago | (#31923114)

RISC typically needs more RAM than CISC (and it seems less than 10% of the die area is devoted to x86 instruction decoding, at least in high-performance processors), so you'll trade the space for more cache for the need for more main memory.

Re:Integrated graphics in the CPU? (1)

marcansoft (727665) | more than 4 years ago | (#31923380)

I love how everyone jumped so quickly on the instruction decoding bandwagon. Of course instruction decoding is cheap these days, even for x86. The problem isn't decoding, it's the huge amount of dirty things that instructions can potentially do after being decoded. Things that go against modern high-performance CPU design principles.

Re:Integrated graphics in the CPU? (2, Interesting)

HarrySquatter (1698416) | more than 4 years ago | (#31923184)

In fact, x86 is a terribly die-area-inefficient architecture; we'd be a lot better off with a modern RISC, opening up space for more cache.

Is this ignoring the fact that most of Intel's chips for many years have basically been RISC processors with an x86 translation unit?

Re:Integrated graphics in the CPU? (0)

Anonymous Coward | more than 4 years ago | (#31923292)

I had this conversation with myself a while back. Would I rather have more cache generating more heat? Or overclock and make up the difference? I think going from 6MB to 12MB is like a 2% improvement in most benchmarks. I chose to get a 2MB Pentium dual-core for $70 and overclock it to 3.5GHz. It runs cool, too.

Re:Integrated graphics in the CPU? (1)

Farmer Tim (530755) | more than 4 years ago | (#31922756)

...couldn't that space in the CPU be used for better things than a redundant graphics circuit?

At first I read that as "retarded graphics circuit". Still made perfect sense...

Re:Integrated graphics in the CPU? (1)

TheLink (130905) | more than 4 years ago | (#31922760)

Basically they've run out of ideas on how to use those billions of transistors to make things faster or better.

It's either:
1) Another CPU core
2) Yet more cache.

And now GPUs...

Too bad Intel can't make great GPUs.

Re:Integrated graphics in the CPU? (1)

maxume (22995) | more than 4 years ago | (#31923502)

It's (mostly) useless for gamers, but one thing Intel does do is make excellent drivers for their graphics chips (at least, that is my experience under XP).

Re:Integrated graphics in the CPU? (1)

Thanshin (1188877) | more than 4 years ago | (#31922806)

Not if you convince the "proper graphics card" crowd to see it all as a CPU integrated into their graphics card.

I don't think it'd be very hard right now to convince an Alienware buyer to buy a computer that's essentially a graphics card with all the rest integrated around it. Except, maybe, the hard drive. And even there you could argue "it has an SSD for you to install one or two games at a time. You can buy a standard HD for the rest."

The only thing to leave outside would have to be the mouse (some elite pro killer razer) and the keyboard (a pro ergonomic strafer pro elite pro fatal1ty). And the headset.

Re:Integrated graphics in the CPU? (0)

Anonymous Coward | more than 4 years ago | (#31922830)

If you could do a virtual SLI... I wouldn't mind as long as it is usable.

to bad it's the same gma crap that amd has a bette (1)

Joe The Dragon (967727) | more than 4 years ago | (#31923018)

Too bad it's the same GMA crap, while AMD has a better onboard chip and plans to work on getting it into the CPU, plus letting it boost an add-in ATI card as well. What will the Intel graphics do, just shut down when a better card is installed?

Re:to bad it's the same gma crap that amd has a be (2, Interesting)

Calinous (985536) | more than 4 years ago | (#31923228)

http://www.anandtech.com/show/2972/the-rest-of-clarkdale-intel-s-pentium-g6950-core-i5-650-660-670-reviewed/2 [anandtech.com]

The i5-661 (with the fastest on-package graphics) is performance-competitive with AMD's latest integrated graphics. The slower on-package GPUs from Intel are behind, but not by much. Nothing Intel can't solve in its next processor (especially as AMD did not increase its IGP performance).

Re:Integrated graphics in the CPU? (1)

Rogerborg (306625) | more than 4 years ago | (#31923122)

Heck, I remember "integrated graphics" the first time round. It was called "using the CPU to do graphics", and it was good enough for us to render buggy whips in 2D, sometimes even 2.5D.

Also, what's with kids these days playing their hippety-hop music way too loud using integrated chips rather than a good old ISA SoundBlaster 16?

Re:Integrated graphics in the CPU? (1)

Targon (17348) | more than 4 years ago | (#31923312)

Low end systems become even cheaper to produce when the chipset on the motherboard does not need to include graphics support. Also, if your add-in video card fails, you can always run off integrated until you can replace it. You are right about a 'proper' video card being a better choice overall, but if you look at those $400 to $500 computer towers being sold all over the place, not a single one has a dedicated video card.

Now, AMD is moving forward with their Fusion project, which will add a GPU to some processors, but since AMD has Hybrid CrossfireX, the GPU portion on the CPU COULD be made to work with an add-in Radeon video card for extra graphics processing power. Intel on the other hand does not make stand-alone graphics products, and they also have zero experience with multi-GPU technologies to do something similar. NVIDIA has SLI, but without a CPU, they are not a part of this discussion.

The big problem that Intel has is that their graphics technology is sub-standard, so at best it still delivers 'integrated graphics' performance levels. Intel also loves forcing people to upgrade motherboards CONTINUALLY, because Intel isn't just the CPU manufacturer, they are the chipset manufacturer as well. In general, Intel may have a performance lead at the high end, but due to the long-term costs, those looking for a mid-range system can still find AMD products worth using. You can start with a basic system with a dual-core processor, and with just a BIOS update you will be able to drop in a six-core Phenom II without ANY other changes.

Re:Integrated graphics in the CPU? (1)

Firethorn (177587) | more than 4 years ago | (#31923442)

But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

Don't look at the PC enthusiast/gamer market. Look at the desktop PC for basic business use. Cost is much more king there, as long as performance is acceptable. You gotta cut a lot of costs if you want to be able to slap down a whole PC for less than $200.

I wouldn't be surprised if in a couple more generations we're looking back at 'system on a chip' designs. No northbridge, southbridge, video controller, etc... Just a central chip on a board with power and interface leads.

Re:Integrated graphics in the CPU? (1)

MoldySpore (1280634) | more than 4 years ago | (#31923528)

Integrated graphics in the CPU are also worthless if you don't have a motherboard with native video outputs, unless I have read the wrong info about this tech.

What GMA stands for (1, Funny)

tepples (727027) | more than 4 years ago | (#31922636)

All LGA1155 CPUs will have integrated graphics built into the core

Will the new integrated GPU have performance even on par with a Wii's GPU, or is it the same GMA (i.e. "Graphics My Ass") that's been built into Intel boards for years?

Re:What GMA stands for (1)

Calinous (985536) | more than 4 years ago | (#31922856)

If it's the graphics chip from the i5-661, then it's competitive with AMD's IGP (AMD might have better drivers, though).

Re:What GMA stands for (1)

bdenton42 (1313735) | more than 4 years ago | (#31923140)

It will probably be junk like usual. If they released on board graphics on par with something like a 9800 GT it would crush NVidia and AMD/ATI as there probably isn't enough of a market above that to keep them operating.

Then there will be Federal investigations and anti-trust lawsuits... they just don't need that kind of trouble.

Re:What GMA stands for (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#31923608)

Will the new integrated GPU have performance even on par with a Wii's GPU, or is it the same GMA (i.e. "Graphics My Ass") that's been built into Intel boards for years?

I always thought it was Garbage Media Adapter.

As long as we get (0)

Anonymous Coward | more than 4 years ago | (#31922638)

Lightpeak

Nah, great (1)

vikingpower (768921) | more than 4 years ago | (#31922642)

I just bought a computer with an 1156 socket on its motherboard, which means that THAT computer will be locked to i5 / i7 in a few years. Hmmm.

Re:Nah, great (1)

IBBoard (1128019) | more than 4 years ago | (#31923394)

Damnit, I just upgraded my old Athlon 64 3500+ with a nice new Core i5 750 as well. £320 for processor, mobo and 4GB memory. Good job I was hoping for it to last for a while, cos it sure looks like Intel don't want me to just upgrade my processor when it gets to be lacking.

Figures... (1)

Captain Centropyge (1245886) | more than 4 years ago | (#31922644)

"Yes, let's force users to upgrade all their hardware when they want a new CPU! Show me the money!"

Re:Figures... (3, Interesting)

TheKidWho (705796) | more than 4 years ago | (#31922754)

You upgrade the CPU/Motherboard/RAM. Big woop.

You would need a new motherboard regardless of whether they changed the socket or not. You would also need new RAM since the RAM requires lower operating voltages.

They probably did this so you don't try to plug in the new CPU on your old motherboard thinking it was a straight upgrade when it requires different circuitry.

Re:Figures... (1)

sznupi (719324) | more than 4 years ago | (#31922966)

One would think somebody at Intel would have noticed by now that it's good to write motherboard specs with bigger headroom for lower voltages...

Of course, they simply don't want that; Intel chipsets bring in quite a lot of money, too.

Re:Figures... (1)

maxume (22995) | more than 4 years ago | (#31923588)

The set of users that upgrade anything is pretty small. The set of users that upgrade CPUs is even smaller.

(so maybe they are doing it to force more sales, but they are pissing in the wind if they are)

Re:Figures... (1)

sznupi (719324) | more than 4 years ago | (#31923732)

That's to be expected if, for many systems, sensible upgrades are blocked. Anyway, if the group willing to upgrade is so small...why does Intel block it?

Re:Figures... (2, Insightful)

Captain Centropyge (1245886) | more than 4 years ago | (#31922970)

So, there's no way to do this using the current socket/motherboard? My guess is that they do this purposely (at least some of the time) so that users need new hardware for their upgrades. It generates more revenue. I work in the software resale industry and the software vendors pull this crap all the time. (e.g. no backward compatibility forces more users to upgrade so that they can all work together)

Re:Figures... (1)

TheKidWho (705796) | more than 4 years ago | (#31923064)

Not without a new chipset.

I don't get why socket compatibility matters when you need a new chipset anyways.

I believe the new processors are using PCI Express 3.0 and require more lanes/copper as well.

Re:Figures... (3, Informative)

sznupi (719324) | more than 4 years ago | (#31923242)

Only because Intel chooses to obsolete old chipsets (or, more precisely, arbitrarily changes bus specs on new motherboards - I've seen an ASRock one for C2D with an i865). AMD somehow manages to keep the latest versions of their CPU interconnect backwards compatible...you really want to say Intel isn't capable of doing so? (especially if Intel simply uses PCI Express for those chips, which is explicitly backwards compatible)

Re:Figures... (1)

Captain Centropyge (1245886) | more than 4 years ago | (#31923244)

I guess this tends to highlight my lack of knowledge around hardware and chipsets. I'll concede this round to you. But it would actually save Intel (or their partners) some cash by not having to come up with a new socket every time they develop new CPUs. They need new molds, tooling, etc. to manufacture even just the socket itself. I guess this line of thought doesn't work so well when dealing with this industry.

Re:Figures... (1)

ircmaxell (1117387) | more than 4 years ago | (#31923414)

Don't concede. They wouldn't need a new CPU if there were intelligence in the design. CPUs should not be talking directly to anything but memory. All other communication (other processors, PCIe, South Bridge, etc.) should be done via a point-to-point protocol. So then the only thing tying a specific CPU to a specific mobo or socket would be the memory technology. The graphics communication could talk to a PCIe device for the display driving (for the actual conversion of signal to DVI). So a legacy mobo could simply use a dead simple PCIe device to do the translation. A newer one could build that device in. But electrically there should be no need for a new motherboard (since it's communicating out via the point-to-point protocol). The only thing that's really tying the CPU to a specific mobo/socket would be the memory technology (since the memory controller is on the CPU, if a CPU has a DDR2 controller it can't be used in a DDR3 mobo and vice versa)... Intel does this because it can make more money on it. That's it... There's no other reason with today's technology to bind a particular CPU generation so tightly to a motherboard or socket...

Re:Figures... (1)

cynyr (703126) | more than 4 years ago | (#31923808)

See AMD's AM2+/AM3: how many CPUs fit in those? I bet you can even drop a newer 6-core AM3 chip into an AM2+ board and get at least 4 of the cores; even then, all that may be needed is a BIOS update.

Re:Figures... (1)

ircmaxell (1117387) | more than 4 years ago | (#31923060)

You would need a new motherboard regardless of whether they changed the socket or not.

Ummm, why? You can upgrade the CPU on an AM2/AM2+ motherboard with at most a flash of the BIOS. And the AM2/+ CPUs are typically backwards compatible (an AM2+ will run on an AM2, but with reduced functionality). So that AM2 board you purchased 4 years ago is still compatible with the latest processors (but not with DDR3). Given AMD's track record with sockets, I'd be surprised if the AM3 gets "phased out" within the next 5 years (meaning that they stop releasing new processors for it)... So can you explain to me why Intel comes out with a new socket every year (seemingly at least)?

And you only need different circuitry when the base implementation is flawed (point-to-point protocols should be flexible enough for anything that you may want to be doing... Oh, and AMD has had an integrated point-to-point interconnect for what, 9 years now? And Intel JUST introduced one maybe 6 months ago?)...

also amd HT is in all CPU's unlike Intel that only (1)

Joe The Dragon (967727) | more than 4 years ago | (#31923320)

Also, AMD's HT is in all its CPUs, unlike Intel, which only has theirs in high-end CPUs.

So Intel's low-end CPUs are stuck with so few PCIe lanes that USB 3.0 can get in the way of x16 video cards, making some boards use PCIe switches, and forcing Apple to use Core 2 in their 13" laptop just to get good video without needing to add a full video chip + chipset.

Intel also uses this to lock out NVidia. They should put their new bus in the i3/i5/i7 (low end), not crap GMA video + 16 PCIe lanes.

This is why, from day 1, Apple should have used AMD, as the first Mac Pro had less PCIe than the G5 had. If Apple had used AMD back then, they could have had a system with a lot of PCIe + maybe even an NVidia SLI chipset.

This is why Apple is thinking about using AMD.

Re:Figures... (2, Insightful)

Targon (17348) | more than 4 years ago | (#31923594)

There are different things to consider. On the AMD side of things, which everyone is using for comparison, you can often drop a new CPU into pretty much any AM2+ or AM3 motherboard with just a firmware update. You don't need to replace the RAM or motherboard, and you get the benefits of the new CPU. Going to a new MEMORY type would require a new motherboard, but with all of the new AMD processors, they support BOTH DDR2 and DDR3 memory.

There really is no good excuse for needing an all-new chipset for each new generation of processor UNLESS there is a very fundamental change going on. The move from DDR1 to DDR2 to DDR3, for example, might be required if the CPU does not support the older memory types (meaning you WANT to prevent users from using a chip in a system that will NEVER support it). Moving to an integrated memory controller, or adding additional pins for more banks of memory, MIGHT be an excuse, though these days, extra "reserved" pins should have been put into the socket specification for this, with backwards compatibility so you could drop it into an older system with a degradation in performance (you lose the extra memory controller functionality). Adding graphics to the processor SHOULD work the same way, where the graphics on the processor would not be used if you plug the processor into a system without support for it. Again, looking forward at future needs when designing a new socket would make sense, so you just have a bunch of pins on the CPU that are "reserved" for future use; then, a new CPU would just switch off features the motherboard would not support.

AMD will be moving to a new socket type in the next year or so, due to things like adding the extra pins on the CPU for graphics, a 3-, 4-, 6-, or whatever-channel memory controller, or other functions being a part of their plans for the future. But that next socket should be good to go for the next few generations after that, and for all we know, it may even support current DDR3 processors (DDR2 would probably be dropped since new motherboards would probably not have DDR2 memory support with the new socket).

So, if AMD can do it, people would expect that 'the leader' in the industry SHOULD be able to do it as well.

Re:Figures... (5, Informative)

zach_the_lizard (1317619) | more than 4 years ago | (#31922764)

You've had to do this for a while. Don't you remember having to get a new motherboard to use newer CPUs, even though they had the same socket? Yeah, I do. That was very confusing at times, and at least with a new socket, you will have a better chance of knowing what will / will not work.

Re:Figures... (1)

oldspewey (1303305) | more than 4 years ago | (#31922772)

This is pretty much what I do these days anyhow. I used to get excited about the idea of an "upgradeable" MB, but for the last bunch of years I have found that I just replace the whole machine (maybe minus the case and the stuff that plugs into the back) when the need/whim hits me.

Re:Figures... (1)

sznupi (719324) | more than 4 years ago | (#31923080)

But upgrading CPUs has become much more attractive lately - you can go, say, from a cheap single-core (AMD still has some single-core Semprons; plus the single-core Athlon64 AM2 was quite popular for some time) in the original, cheap machine to a now-also-cheap quad-core. That's a huge boost for very little money (you might also upgrade memory while DDR2 is still cheap).

Of course Intel simply wants you to buy more; chipsets are also quite lucrative, after all (maybe pointing out that it's a horrible waste would work with current environmental sentiments?)

Re:Figures... (1)

oldspewey (1303305) | more than 4 years ago | (#31923432)

But I'm not likely to buy the cheapest single core CPU in the first place - I typically look for the fattest part of the performance/value curve (usually somewhere in the $150-$200 range).

As for environmental sentiments, it's worth noting that one approach means I end up with an old CPU and some sticks of RAM sitting in a cardboard box somewhere, while the other approach means I have a functioning MB plus all the subsystems ready to be repurposed or donated someplace where it will be of some value to somebody.

Trust me, I've spent a lot of years and money doing incremental upgrades to machines. I guess I've just arrived at a point where I can't be bothered hunting for that elusive 10% performance delta on an existing platform. I'll still add memory or hard drive capacity if necessary, but even that doesn't seem to be particularly frequent anymore.

Re:Figures... (1)

sznupi (719324) | more than 4 years ago | (#31923688)

But you were likely to buy it 3 years ago, in the form of an AM2 Athlon64. Now, and still for quite some time to come, you can slap in a quad-core. Or if buying some cheap CPU now, you would still be able to upgrade to a significantly faster one later on...

You can also donate just the old CPU, btw...somebody will need it (old sticks of RAM? You just buy a reasonable amount in two sticks at first and have two free slots for later expansion...yes, you have to take note to get a motherboard with 4 RAM slots, but that won't really cost you any extra)

"Incremental" is not a good word in this case btw. Just two-step (plus you can get quite a bit more than 10%...)

Re:Figures... (1)

228e2 (934443) | more than 4 years ago | (#31923452)

QFT

I'm still running my 2.2 GHz dual core on a mobo I bought 6+ years ago... I'll upgrade when that dies. It's perfectly fast enough for all my games and everything I need to do.

Re:Figures... (0)

Anonymous Coward | more than 4 years ago | (#31922850)

Do people really upgrade that often? Particularly these days? I went from a Northwood to a Conroe which continues to be an excellent processor. I might upgrade my video card or RAM once each during the lifetime of the machine, but that's it.

Re:Figures... (1)

bdenton42 (1313735) | more than 4 years ago | (#31923232)

I just upgraded a Northwood 2.8 to a Core 2 Duo 3.0. I used the same graphics card (8800 GT) in the new machine.

Night and day difference.

Re:Figures... (1)

LinuxIsGarbage (1658307) | more than 4 years ago | (#31923634)

I just upgraded a Northwood 2.8 to a Core 2 Duo 3.0. I used the same graphics card (8800 GT) in the new machine.

Night and day difference.

Power bill dropped that much?

next generation human beings, yet another chance? (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31922742)

likely some of the not so human doings will need to be undone first.

the corepirate nazi illuminati is always hunting that patch of red on almost everyones' neck. if they cannot find yours (greed, fear ego etc...) then you can go starve. that's their platform now. they do pull A LOT of major strings.

continued God's speed to you Mr. President.

never a better time for all of us to consult with/trust in our creators. the lights are coming up rapidly all over now. see you there?

you have the right to remain silent.

greed, fear & ego (in any order) are unprecedented evile's primary weapons. those, along with deception & coercion, helps most of us remain (unwittingly?) dependent on its' life0cidal hired goons' agenda. most of our dwindling resources are being squandered on the 'wars', & continuation of the billionerrors stock markup FraUD/pyramid schemes. nobody ever mentions the real long term costs of those debacles in both life & any notion of prosperity for us, or our children. not to mention the abuse of the consciences of those of us who still have one. see you on the other side of it. the lights are coming up all over now. the fairytail is winding down now. let your conscience be our guide. you can be more helpful than you might have imagined. we now have some choices. meanwhile; don't forget to get a little more oxygen on yOUR brain, & look up in the sky from time to time, starting early in the day. there's lots going on up there.

The current rate of extinction is around 10 to 100 times the usual background level, and has been elevated above the background level since the Pleistocene. The current extinction rate is more rapid than in any other extinction event in earth history, and 50% of species could be extinct by the end of this century. While the role of humans is unclear in the longer-term extinction pattern, it is clear that factors such as deforestation, habitat destruction, hunting, the introduction of non-native species, pollution and climate change have reduced biodiversity profoundly.' (wiki)

"I think the bottom line is, what kind of a world do you want to leave for your children," Andrew Smith, a professor in the Arizona State University School of Life Sciences, said in a telephone interview. "How impoverished we would be if we lost 25 percent of the world's mammals," said Smith, one of more than 100 co-authors of the report. "Within our lifetime hundreds of species could be lost as a result of our own actions, a frightening sign of what is happening to the ecosystems where they live," added Julia Marton-Lefevre, IUCN director general. "We must now set clear targets for the future to reverse this trend to ensure that our enduring legacy is not to wipe out many of our closest relatives."

"The wealth of the universe is for me. Every thing is explicable and practical for me .... I am defeated all the time; yet to victory I am born." --emerson

no need to confuse 'religion' with being a spiritual being. our soul purpose here is to care for one another. failing that, we're simply passing through (excess baggage) being distracted/consumed by the guaranteed to fail illusionary trappings of man'kind'. & recently (about 3000 years ago) it was determined that hoarding & excess by a few, resulted in negative consequences for all.

consult with/trust in yOUR creators. providing more than enough of everything for everyone (without any distracting/spiritdead personal gain motives), whilst badtolling unprecedented evile, using an unlimited supply of newclear power, since/until forever. see you there?

"If my people, which are called by my name, shall humble themselves, and pray, and seek my face, and turn from their wicked ways; then will I hear from heaven, and will forgive their sin, and will heal their land." )one does not need not to agree whois in charge to grasp the notion that there may be some genuine assistance available to us(

boeing, boeing, gone.

Re:next generation human beings, yet another chanc (-1, Offtopic)

TheKidWho (705796) | more than 4 years ago | (#31922918)

I don't get it, what does your post have to do with Boeing?

Planned Obsolescence (0)

Anonymous Coward | more than 4 years ago | (#31922770)

Can we get a Planned Obsolescence tag for Slashdot stories please?

And yet,... (5, Informative)

Pojut (1027544) | more than 4 years ago | (#31922814)

...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.

Re:And yet,... (0)

Anonymous Coward | more than 4 years ago | (#31923234)

Pointing out the obvious, but you had no reason to use the word literally in that sentence.

Re:And yet,... (1)

MadKeithV (102058) | more than 4 years ago | (#31923550)

Pointing out the obvious, it was obviously not necessary to point that out.

Re:And yet,... (1)

Kjella (173770) | more than 4 years ago | (#31923458)

Same socket, but can it run all the newer processors? That at least happened to me with a Shuttle I had that I thought about upgrading - for various reasons with that board it couldn't, even with a BIOS upgrade. And there always seemed to be some sort of shift like AGP to PCIe, PATA to SATA, DDR2 to DDR3, USB 1.0 to 2.0, or various other good reasons to upgrade anyway. Expansion cards are just silly expensive compared to motherboards, I'm guessing due to volume.

To take one example, any decent mobo today comes with 6+ SATA-II ports. As an expansion card, the cheapest 4-port SATA-II controller is 439,- NOK. I can get a full new P43 motherboard with 6 channels for 499,- NOK. I guess it works nicely if the CPU is the one and only thing you would like to upgrade. By the way, Intel really should put their 6-disk controller on an expansion card and kill the competition; I don't understand why they don't.

Maybe for once things are finally stable enough to be worth it... but for my part, I've been burned enough times that I don't even think about future compatibility - I only look at what I want here and now and whether that hardware will last.

Re:And yet,... (1)

Pojut (1027544) | more than 4 years ago | (#31923718)

Yup, it can do it all...the ONLY thing I can't do is run DDR3 (it has four DDR2 slots), but other than that I can take care of all the new stuff (exceptin' USB 3.0 and SATA 6Gb/s, of course...but not much on the market can do that yet either)

A win for AMD (5, Insightful)

Albanach (527650) | more than 4 years ago | (#31922894)

I can't understand why they would force another socket design on customers. I am using a four year old motherboard and recently replaced my AMD CPU with a current model. It was a drop in replacement. Sure I could get some benefits from a newer MB, but I can make the upgrade at a time of my choosing. I can spread the cost, get the big boost from the CPU now and get a smaller boost from a new MB in a year's time.

Board manufacturers have to spend money implementing the new socket. Retailers are stuck with old stock that no-one wants because a new socket is around the corner.

It raises prices and hurts the end user. Why are we still seeing this behavior?

Re:A win for AMD (1)

CrimsonAvenger (580665) | more than 4 years ago | (#31923026)

It raises prices and hurts the end user. Why are we still seeing this behavior?

I think you answered your own question in the first three words of the question....

Re:A win for AMD (1)

NevarMore (248971) | more than 4 years ago | (#31923040)

It raises prices and hurts the end user. Why are we still seeing this behavior?

It raises prices and helps Intel.

Re:A win for AMD (1)

Rogerborg (306625) | more than 4 years ago | (#31923182)

Uh, perhaps because renegades like me and thee - heck, we're probably filthy hackers, and we may even have links to organised crime - who upgrade our systems are an insignificantly small market, and Intel are happy to cede it to AMD in order to squeeze more profit out of the other 98% of their customers?

Re:A win for AMD (0)

Anonymous Coward | more than 4 years ago | (#31923398)

Because we live in a corporate world, and corporations like Intel see one thing and one thing only: Profit.

Re:A win for AMD (4, Insightful)

PhrstBrn (751463) | more than 4 years ago | (#31923522)

Because Intel sells motherboards and chipsets too. They don't want to sell you just a new processor, they want to sell you a new processor and a motherboard.

If Intel thought they could make more money by keeping their stuff backwards compatible, they would, but I'm sure the bean counters figured the amount of sales lost to AMD would be less than the profits they could make by forcing you to buy new motherboards too, and I would tend to agree with that.

I don't like it, I don't think it's good for consumers, but it makes sense from Intel's perspective.

Re:A win for AMD (1)

SharpFang (651121) | more than 4 years ago | (#31923538)

Board manufacturers get to push a new board model for people who want to upgrade the CPU.

I upgraded a CPU once. The CPU required a new motherboard. The new motherboard required new RAM and new gfx card. And the new components combined required a new PSU.

Pure business.

Oh, I prefer my toast in bites! (1)

angiasaa (758006) | more than 4 years ago | (#31922936)

Honestly, this would work for just the corporate users, and perhaps the oldies who don't go FPS'ing around the place. It would also do for mobile apps in a pinch. But for all practical purposes, it would seem that Intel is taking a step back into the netherrealm.

The additional surface area offered by a separate GFX chip allows it to cool faster. Frankly, I'd rather slap on a separate GFX card altogether, than waste transistors in my main processor for physics and pixel processing.

Keep the space for cache or add some more muscle to the chip, but don't go stuffing graphics or audio processing in there. The BUS speeds today are good enough to handle the stuff we throw at them with separate chips, so there!

If my Graphics chip blows, I can always replace it. If I somehow manage to fry my processor, I can replace that. Why replace everything if one goes kaput?

Re:Oh, I prefer my toast in bites! (1)

tepples (727027) | more than 4 years ago | (#31923100)

Frankly, I'd rather slap on a separate GFX card altogether, than waste transistors in my main processor for physics and pixel processing.

Or in a desktop PC, you could have a GMA running one monitor and a GeForce running the other.

Re:Oh, I prefer my toast in bites! (1)

bluefoxlucid (723572) | more than 4 years ago | (#31923274)

It'll never catch on. Well, it'll catch on fire.

Re:Oh, I prefer my toast in bites! (1)

Calinous (985536) | more than 4 years ago | (#31923326)

If your sound card blows, you could replace it. If your mainboard blows, you could replace it. Why replace everything when one goes kaput?
Just as discrete CD-ROM controllers (ISA) went the way of the dodo, just as IDE controller boards did, just as network cards got integrated...

One freaking pin?! (3, Funny)

Hatta (162192) | more than 4 years ago | (#31923084)

How about you design the next socket with twice as many pins as you think you'll need? Then we won't run out and have to buy a whole new motherboard when we just want a faster CPU.

Re:One freaking pin?! (0)

Anonymous Coward | more than 4 years ago | (#31923174)

packaging cost?

Re:One freaking pin?! (3, Funny)

GungaDan (195739) | more than 4 years ago | (#31923246)

The new one has one FEWER pin than the current socket. So obviously next time they should either design one with a single removable pin, or no pins at all.

Duhhh! (1)

spammeister (586331) | more than 4 years ago | (#31923112)

Of course we all saw this coming. It's what Intel has done since time immemorial.

I'm sure all the Intel and AMD fanboys will do what they do. They chose their camp, now they gotta take their lumps.

Memory type? (1)

nomaan (685185) | more than 4 years ago | (#31923168)

No mention of the memory type to be used with the new chip, so that might be the reason for a new socket/motherboard upgrade. Either way, this strategy of changing sockets every few years blows hard.

Paralleling (0)

Anonymous Coward | more than 4 years ago | (#31923226)

How long before code comes out to take advantage of the massively increased parallelism available on that core?

Get a Mac (1, Funny)

Anonymous Coward | more than 4 years ago | (#31923280)

No more CPU upgrades problems! /duck

(only half-kidding, though)

This simplifies cooling design so much! (1)

Kazymyr (190114) | more than 4 years ago | (#31923302)

I'm sure that combining the two biggest heat sources in a computer on the same die is a very well thought move. Especially for mobile versions. Yay.

The processor is only one part of performance (2, Interesting)

CFD339 (795926) | more than 4 years ago | (#31923364)

A large part of the performance gain in new generation processors is actually the combination of the processor and chipset. The Core i5, Core i7, etc. processors did away with a separate memory controller -- that itself has been a huge power and speed advantage. Without upgrading the stuff supporting the chip, you don't get much benefit from an upgrade.

16 pci-e lanes to low when the chipset lacks usb 3 (1)

Joe The Dragon (967727) | more than 4 years ago | (#31923390)

16 PCIe lanes is too low when the chipset lacks USB 3.0 and other things like SATA 3.0; the new buses force motherboard makers to use switches and other stuff to fit in video + SATA 3.0 + USB 3.0, or to cut the video card down to x8.
