
Intel Takes Quad Core To the Desktop

Zonk posted more than 7 years ago | from the and-stay-there dept.


Rob writes to mention a Computer Business Review Online article about Intel's official launch of the Kentsfield chip. Intel claims its quad-core offering is up to 80% faster than the dual-core Conroe released this past July. From the article: "Kentsfield, a 2.66GHz chip with a 1066MHz front-side bus, is more for computational-heavy usage, including digital content creation, engineering analysis, such as CAD, and actuarial and other financial applications. Steve Smith, director of operations for Intel digital enterprise group, claimed rendering is 58% faster for users building digital content creation systems, for video, photo editing or digital audio. In other words, Kentsfield is for high-end desktops or workstations only. For the average office worker who uses their PC for general productivity apps, such as communications and garden-variety computing, Smith recommended the Core 2 Duo from 'a price point and performance perspective.'"


191 comments


whats next (1)

Xamedes (843781) | more than 7 years ago | (#16836852)

quadrupoles chipset?

Re:whats next (2, Insightful)

eldavojohn (898314) | more than 7 years ago | (#16836942)

quadrupoles chipset?
Well, you were probably joking, but I'll open up a discussion to "whats next?" because this is something I feel the chip makers have kind of lost their way on.

First off, I'm not criticizing only AMD or Intel, I think they're both guilty of concentrating on perceived performance on desktop CPUs. They don't care how much power the chip consumes or how much heat it dissipates, they only care about what the average consumer sees as immediate performance. To me, performance can be multiple things and considering that you could fry an egg on my P4 no matter how big the heat sink is ... I don't think I'm going to get many years of use out of it. So heat & power consumption are steadily growing concerns of mine. I had an Athlon XP 2800 break after one year of use--last time I use the heat sink that comes with the processor!

What's next is simply that which is cheapest to research and develop while giving the user a higher number in some category that Dell or the sales people are sporting as bigger/better/faster/stronger. This is alright but I don't think the average consumer ever stops and asks themselves what the power consumption will be for such a CPU or what its expected time to failure is. I really hope that at some point, the chips are fast enough to run your basic operating system and the manufacturers split into two lines where one is aimed for longevity and power consumption (like some laptop model processors) instead of just speed.

Re:whats next (5, Insightful)

Divebus (860563) | more than 7 years ago | (#16836980)

After Effects Rendering. Final Cut Pro HD Rendering. Maya Rendering. Video Compression [Rendering]. If you've ever done what they target this processor for, you'll COMPLETELY appreciate any time NOT spent watching the growbar work. Bring it on, I've been waiting to replace several G5s doing this all day, every day.

Re:whats next (1)

Mr. Mindless (259403) | more than 7 years ago | (#16837188)

where's "+1 amen"?

Sitting there watching growbars... grow is never fun. With luck, the question will soon be "how do I throw data at this processor fast enough to keep the pipeline full?" Depending on the application, I wouldn't be surprised if storage speeds are outpaced by this generation of chips when performing operations on static data, and precaching data in RAM or other fast storage sooner is going to have a big effect on render speeds for stored data in applications like recompression of video.
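The precaching idea above is easy to sketch: a background thread keeps a small buffer of upcoming chunks filled, so the compute thread never stalls waiting on storage. A minimal illustration in Python, where `chunks` and `process` are placeholders for whatever the application actually reads and computes:

```python
import queue
import threading

def prefetching_pipeline(chunks, process, depth=4):
    """Read ahead on a background thread so the worker rarely waits on I/O."""
    buf = queue.Queue(maxsize=depth)  # bounded: at most `depth` chunks cached
    SENTINEL = object()

    def reader():
        for chunk in chunks:          # e.g. blocks streamed from disk
            buf.put(chunk)            # blocks when the buffer is full
        buf.put(SENTINEL)

    threading.Thread(target=reader, daemon=True).start()

    results = []
    while (chunk := buf.get()) is not SENTINEL:
        results.append(process(chunk))  # CPU-bound work, e.g. re-encoding
    return results
```

With `depth` chunks buffered, slow reads and slow processing overlap instead of alternating; the same shape works with a process pool feeding multiple cores.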

Re:whats next (1)

jackbird (721605) | more than 7 years ago | (#16837846)

For the price, I'd rather have 2 dual Woodcrest 2.6 Mac Pros to get 8 cores. A lot of rendering-intensive apps have diminishing returns per core after 4-way SMP, anyhow.

Re:whats next (2, Interesting)

Divebus (860563) | more than 7 years ago | (#16838162)

For the price, I'd rather have 2 dual Woodcrest 2.6 Mac Pros to get 8 cores...

Ahhh... if only EVERYTHING was Xgrid aware, then that would work... I'd get a pile of Minis or the Xgrid agent for Linux. Hell, After Effects can't even use all the RAM in one machine, much less Xgrid.

More processors in one box is the only thing the current incarnation of After Effects can take advantage of... with diminishing returns on the processor count as you pointed out. Our Quad G5 is not twice as fast as a Dual G5 rendering After Effects - maybe 1.6 times faster.

However, the same Dual G5 2GHz was still 2 or 3 times faster than the Dual 2.4GHz Xeon under Windows doing the same After Effects work... and that software is optimized for Windows.

(oh god, I've just opened the flood gates for pimple faced gamers to flame me)

Re:whats next (1)

ChibiLZ (697816) | more than 7 years ago | (#16837078)

First off, I'm not criticizing only AMD or Intel, I think they're both guilty of concentrating on perceived performance on desktop CPUs. They don't care how much power the chip consumes or how much heat it dissipates, they only care about what the average consumer sees as immediate performance. To me, performance can be multiple things and considering that you could fry an egg on my P4 no matter how big the heat sink is ... I don't think I'm going to get many years of use out of it. So heat & power consumption are steadily growing concerns of mine. I had an Athlon XP 2800 break after one year of use--last time I use the heat sink that comes with the processor!
True, but I know at least with the Core 2 Duo Intel not only focused on improved performance, but also lower power consumption and heat generation. I was starting to get scared of Intel chips until that point, wondering if purchasing one might be akin to laying a lump of thermite on my floor, but they made a step in the right direction. I hope that they keep moving forward with cooler, less power-hungry chips.

Re:whats next (1)

Junks Jerzey (54586) | more than 7 years ago | (#16837752)

True, but I know at least with the Core 2 Duo Intel not only focused on improved performance, but also lower power consumption and heat generation. I was starting to get scared of Intel chips until that point, wondering if purchasing one might be akin to laying a lump of thermite on my floor, but they made a step in the right direction. I hope that they keep moving forward with cooler, less power-hungry chips.

It's still pretty scary, though. Sure, Core 2 Duo was focused on lower power consumption, but pick any Core 2 Duo notebook and do something that involves heavy computation on both CPUs and listen to the fans crank up. Feels like CPU makers are still walking a fine line between fast and actually usable in the kind of computers people want.

Personally, I'd like to see notebook makers focus on getting the kind of battery life you see in the Nintendo DS (but using a larger battery, of course). Even if speeds are knocked in half, that's okay.

Re:whats next (1)

Eivind (15695) | more than 7 years ago | (#16837092)

Longevity hasn't been a problem up until now. I've generally stopped using computers not because they've stopped working, but because there's a much better one available for a price that is low enough to make swapping viable. (economically advantageous even, since I use my computers for a living)

At work it's the same -- very few of our computers get swapped because they're broken. Most get swapped because it's bad business to have a $100/hour employee sitting around waiting for a $1000 computer to get around to doing its thing. Especially when there's now a computer three times as powerful available for $1000.

This may change, but I doubt it. In my jurisdiction consumer protection laws ensure that consumers are covered against (non-abuse) defects for 5 years, so if a substantial fraction of computers start blowing up in less than 5 years, that'll be *really* bad business for Intel/AMD and friends. If they blow up *after* 5 years the consumer is out of luck. But most people are reasonably happy swapping computers every 5 years in any case.

Now power consumption is a different matter. At work we care about this indirectly -- we demand silent machines, and power-hungry tends to equal noisy fans (which disturb). At home I care too, even though many people don't (or aren't aware that there are significant differences).

Re:whats next (4, Insightful)

Lumpy (12016) | more than 7 years ago | (#16837430)

You want to know what consumers want? Cheaper.

They are happy with their new Dell 1.8GHz Pentium M laptops and that horribly outdated and incredibly slow 1.8GHz P4 they bought 3 years ago.

Consumers are happy now. Computers have stagnated hard for the past 3-4 years and the performance gains offered by this new stuff are only marginal for them.

On video editing, I can see the advances IF your app can take advantage of it; the problem is current apps can't take full advantage of that processor until a new build or version is made to exploit it.

The consumer yawns and happily uses their 3-year-old PC or that $299 cheapie from Dell with a flat panel that's just as slow. They don't care about 64-bit, dual or quad core.

At least until they buy a new OS and discover that the added bloat requires more processing power to display menus and move the mouse cursor.

Re:whats next (5, Insightful)

archen (447353) | more than 7 years ago | (#16837536)

They don't care how much power the chip consumes or how much heat it dissipates,

Oh really? Now I can't say as far as Intel, but AMD has been very focused on power consumption for a very long time now. All of their literature is filled with benchmarks of performance-per-watt and total power savings in the data center, etc. If AMD doesn't care about power consumption, then why would they specifically go to pains to offer CPU versions that are even MORE aggressive in their power saving if you pay a bit more for them? And with all of their power saving innovation and dedication, what do they get? Intel now outperforms them and everyone jumps ship and goes over to the Intel side (despite the fact that the lower-power versions of AMD's CPUs still use less power once the chipset is counted in).

You know why they care about what performance the average consumer sees? Because that's all consumers care about. If it were otherwise you wouldn't be seeing your lights dim when your graphics card goes into high gear. Where are the "power conscious" versions of these graphics cores?

I've got a lot of Athlons and Athlon XPs running where I work. Some burn out, but that's often because of their environment, and due to the fact that the fan that comes with the heatsink for the OEM version is garbage, almost guaranteed to burn out after a year in high-dust environments. The Pentium 4 is history; even Intel admits it was on the wrong track. If you want more longevity, then get a robust heatsink fan (undervolted) and underclock your CPU. You DO underclock your CPU right?

Power consumption (1)

Lonewolf666 (259450) | more than 7 years ago | (#16838070)

Actually both AMD and Intel have improved in heat & power consumption.

On Intel's side, the Core2Duo has a lower power consumption than the P4, despite having two cores. If you get the "smallest" version, the E6300, you should end up with a PC that has moderate power consumption combined with very nice performance.

AMD has recently lowered the prices on their "energy efficient" series of dual cores, and the availability has improved. If you buy an Athlon 64 EE 3800+ or an EE 4200+, you should also get a nice ratio of performance to power consumption, both at a somewhat lower level than with a Core 2 Duo. BTW, the higher-clocked models are IMHO overpriced when compared to the Intel Core2Duo.

Re:whats next (2)

wile_e_wonka (934864) | more than 7 years ago | (#16838286)

My understanding was that these chip makers have actually been forced to watch power consumption and heat dissipation by purchasers of laptops. We don't like our battery dying an hour into a meeting, and we don't like our computers burning our laps in the airport, but we do like our computers to be fast (time is money; I can't afford for my laptop to take five minutes to turn on). I think they first really addressed this with the Pentium M/Centrino platform.

Also, I could be wrong, and I don't have any numbers in front of me right now, but my understanding was that the dual core chips generally did run cooler and were less power hungry than the high end single core chips.

Re:whats next (1)

pkulak (815640) | more than 7 years ago | (#16838736)

I'm sure Pixar is pissed that processors keep getting faster.

Re:whats next (1)

J.R. Random (801334) | more than 7 years ago | (#16838744)

They don't care how much power the chip consumes or how much heat it dissipates, they only care about what the average consumer sees as immediate performance.

Actually, they care very much how much power the chip consumes, because this matters in both the server markets at the top and the laptop markets at the bottom. If I recall correctly, AMD's quadcore chip will be able to put individual cores into low power mode when they aren't needed. Of course the highest clocked AMD or Intel chips require their own nuclear reactor to power them, but that's because they're targeted towards the hard core gamer market and it's the hard core gamers who don't care how much power the chip consumes.

New Quad Core? (1, Offtopic)

smitty_one_each (243267) | more than 7 years ago | (#16837216)

New Quad Core?
Earth warming more.
Or so says
The junior Gore.
Coolness to your every pore:
Burma Shave

640 (-1)

suso (153703) | more than 7 years ago | (#16836858)

I thought they said that 2 cores would be enough for the desktop for the foreseeable future

*Studio Audience laughs*

Re:640 (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16837086)

Me thought I must get me one of dem thar quad core muhas to browse da intarweb and look dem family photos. Den I red da article and realize that computers compute:

Kentsfield, a 2.66GHz chip with a 1066MHz front-side bus, is more for computational-heavy usage
Who wudda thunk it? computer?

Re:640 (2, Informative)

bealzabobs_youruncle (971430) | more than 7 years ago | (#16837564)

They still are, this isn't aimed at average desktop usage, RTFA.

damn it (0)

Anonymous Coward | more than 7 years ago | (#16836884)

New processors come out every month. I've just bought a 24" iMac, only to find that in 3 months they'll get another upgrade.

Why downplay it? (4, Insightful)

Salvance (1014001) | more than 7 years ago | (#16836886)

"Core 2 Extreme quad-core QX6700" - There's a mouthful. It's funny that Intel is continually trying to downplay the importance of this chip for the average user. They say it's "more for computational-heavy usage, including digital content creation, engineering analysis, such as CAD" ... sounds like gamers would flock to this. Maybe they realize it's a rushed product (to beat AMD to the punch), and it will be in short supply?

Re:Why downplay it? (2, Insightful)

Anonymous Coward | more than 7 years ago | (#16836948)

Or it may be that most games are not optimized for multiple cores. If they target gamers, only for gamers to discover that there is little improvement over their previous processor, then Intel's image with gamers would be damaged. Some game companies such as Valve have recently started to embrace multi-core processors, but it will be a while before new games are published that take advantage of those extra cores.

Re:Why downplay it? (4, Insightful)

GauteL (29207) | more than 7 years ago | (#16837160)

Currently the quad-core is pretty useless for gamers unless you like to run video encoding apps at the same time as you play your game.

The reason, of course, is that most games are barely optimised for dual cores, let alone four. It is not simple, either: balancing several cores to get the most out of them requires a redesign of the game engine.

It will be significant for future games, but you are better off buying a high-end dual core now and replacing it with quad-core later on.

Re:Why downplay it? (4, Insightful)

oojah (113006) | more than 7 years ago | (#16837234)

you are better off buying a high-end dual core now and replacing it with quad-core later on.

Right. The best bit about quad core for the moment is that it should drive the dual core prices down.

Cheers,

Roger

WTF? (5, Funny)

LibertineR (591918) | more than 7 years ago | (#16837366)

"Currently the quad-core is pretty useless for gamers unless you like to run video encoding apps at the same time as you play your game."

What? I thought EVERYONE used WinDVR to encode MPEG2 files of Battlestar Galactica from their TiVo while playing F.E.A.R., and turned it into H262 for uploading as a killer torrent while kicking butt in Call of Duty 2? I suck the life out of an X2 4400, bitch, and I am NOT alone.

We can't all have a life, so I NEED that chip!

You insensitive clod!!!

Re:Why downplay it? (1)

xplusaks (1027094) | more than 7 years ago | (#16837732)

We know a man once upon a time said, "I think there is a world market for maybe five computers." - Thomas Watson.

And after that we got a flash from the blue... one must learn from history.

Re:Why downplay it? (1)

TheBogBrushZone (975846) | more than 7 years ago | (#16837854)

I think it is down to the development cost for game developers and publishers. Optimising for multiple cores (and for different numbers of cores, from 1 upwards) requires at minimum a good understanding of concurrency (i.e. computer scientists) to write new code, or at worst a complete re-write of an established and tested geometry engine to support vector processing. Add this to the uncertainty of whether dedicated geometry processors (or GPUs running geometry) will be a more popular, effective or cost-effective solution, and you get a CPU development that, while it may technically be capable of increasing game performance, simply has no current commercial support.

Re:Why downplay it? (1)

mikael (484) | more than 7 years ago | (#16838092)

Perhaps the CPU manufacturers are desperately competing against the GPU manufacturers for developers of scientific applications? Nvidia just announced their 8800 series GPUs with support for BLAS [netlib.org], a foundation library for intensive engineering calculations.

Engineering, digital content creation, and gaming all use similar algorithms to model/visualize water, fire and smoke. However, engineering requires high precision (64-bit floats) while animation and gaming can get away with lower precision (16-bit floats).

It's really going to be overkill to use a quad-core CPU with 64-bit precision floating-point units to run a single-threaded game engine that does all the animations effects on the GPU. It might still be overkill even if a game engine had separate threads for handling player input, game server communication, physics and rendering.

Re:Why downplay it? (1)

Pollardito (781263) | more than 7 years ago | (#16838166)

"more for computational-heavy usage, including digital content creation, engineering analysis, such as CAD"
that's a polite way to say "it's for stuff that does a lot of work on a little data, because those 4 processors are clogged on the same data pipes", in other words "not most games"

Re:Why downplay it? (1)

mspohr (589790) | more than 7 years ago | (#16838706)

I think the GPU (graphics processing unit) people have a big advantage for gamers (and all digital content creation tasks). Most GPUs are 10-20 times as fast as CPUs for their specialized functions, so a "58%" speed improvement with the quad-core CPU doesn't mean anything.

Re:Why downplay it? (1)

FuturePastNow (836765) | more than 7 years ago | (#16838718)

Why downplay it? Because it costs a grand. No matter how much benefit it might be to the "average user," the price of the chip and the systems that use it will drive people away.

Enthusiasts want Kentsfield and are willing to pay for it. Average people may want it, but are only willing to pay for Celeron.

Office Apps (4, Funny)

Ginnungagap42 (817075) | more than 7 years ago | (#16836924)

So how does Minesweeper run on it?

My recommendation (3, Funny)

LeedsSideStreets (998417) | more than 7 years ago | (#16837692)

For Minesweeper, you should have at least one processor core per game square.

It's the only way to play.

It doesn't! (0)

Anonymous Coward | more than 7 years ago | (#16837708)

The M$ EULA lets you run the M$ Minesweeper Operating System (Windows) on only two CPUs, AFAIK!

Re:Office Apps - MineSweeper (1)

Nom du Keyboard (633989) | more than 7 years ago | (#16838716)

So how does Minesweeper run on it?

You can now sweep mines out of four oceans simultaneously.

WoW-Core (1)

digitaldc (879047) | more than 7 years ago | (#16836932)

All I want to know is if QuadCore will make my World of Warcraft Elite battle load and display 4-times faster?

Re:WoW-Core (1)

skinfitz (564041) | more than 7 years ago | (#16837220)

If it doesn't, it's sure to happen in the near future, as there is bound to be demand for this as multi-core becomes more popular.

Re:WoW-Core (1)

gyranthir (995837) | more than 7 years ago | (#16838062)

WoW at the moment doesn't even support dual-core processors, and they have been out going on 2.5 years now. I doubt they will be updating to quad-core support anytime soon. Anyway, World of Warcraft is barely taxing on hardware. Hell, it runs on a 900MHz Pentium 3 with a GeForce2 in it.

Re:WoW-Core (1)

TheBogBrushZone (975846) | more than 7 years ago | (#16838382)

WoW does support dual core (or did when I last played 2 months ago). Blizzard have stated that WoW has two threads but one of them is significantly more processor-intensive than the other so it takes some very limited advantage of a second core. It does (or did?) however experience some strange bugs with dual core such as an FPS limit of 64 and stuttering graphics when turning around.

Re:WoW-Core (1)

gyranthir (995837) | more than 7 years ago | (#16838770)

And constant crashes. Fine, let's put it this way: it barely supports dual-core architecture, in the most crippled, limited, and rudimentary form possible, with an absurd number of errors and problems coming from trying to support it.

Re:WoW-Core -- Need More (1)

Nom du Keyboard (633989) | more than 7 years ago | (#16838756)

...load and display 4-times faster?

Only if you have 4 disc drives to feed the four cores simultaneously.


Will it be supported... (3, Insightful)

AppreticeGuru (1024775) | more than 7 years ago | (#16836952)

4 cores is great and all, but I know they are still working on multi-threading support with only 2 cores for games such as many Steam offerings, so I'd have to imagine that game support to really take advantage of a 4-core system is a long way away. I was still psyched about the low-voltage powerhouses for laptops, and I'm wondering how much extra heat 4 cores are going to put out as well. How many apps are really geared to take advantage of 4 cores atm, really?

Only 50 percent speedup on two cores (1)

Latent Heat (558884) | more than 7 years ago | (#16837338)

I have a number-crunching program which I rewrote multi-threaded. It is simple enough -- I need 100,000 calculations of the same thing only with different parameters each time, so run them in parallel, two at a time for two threads, four at a time for four threads. I still have the bookkeeping of saving the results of each calculation, and I have streamlined the calculation enough that the bookkeeping part is a significant piece of the overall time.

I am about 50 percent faster on two cores -- I am guessing I will be maybe another 20 percent faster on 4 cores. If we get the Che Guevara number (1, 2, many cores), I am not sure how this helps without a radical rethinking of how we write programs.
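For what it's worth, those numbers line up with Amdahl's law: if the parallel fraction p of the run satisfies 1.5 = 1/((1-p) + p/2), then p = 2/3, and no number of cores can ever deliver more than a 3x speedup. A quick sanity check (the 1.5x figure is from the post; nothing here is a benchmark):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: the serial part caps the speedup, however many cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A 1.5x speedup on 2 cores implies the parallel fraction p = 2/3:
# 1.5 = 1 / ((1 - p) + p/2)  =>  p = 2/3
p = 2.0 / 3.0
print(amdahl_speedup(p, 2))     # ≈1.5, matching the observed run
print(amdahl_speedup(p, 4))     # ≈2.0: only ~33% faster than 2 cores
print(amdahl_speedup(p, 10**6)) # approaches 1 / (1 - p) = 3.0, the hard ceiling
```

So the "maybe another 20 percent" guess is in the right ballpark, and shrinking the serial bookkeeping matters more than adding cores.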

Memory bandwidth (1)

Krischi (61667) | more than 7 years ago | (#16838374)

Are you sure that the bookkeeping is the limiting factor in your program? It could also be that you are simply running into a memory bandwidth bottleneck. When you have datasets that do not fit in the L2 cache, main memory simply cannot keep up with the amount of data that the CPUs chew through. With multiple cores all sharing the same memory link, this problem is only going to get worse.

One more argument in favor of the AMD Opteron architecture, even though for single-threaded applications it is currently slower than Intel's best offerings. At least with the Opteron, you can have dedicated memory for each node.
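A back-of-the-envelope calculation shows how easily four cores can oversubscribe one shared front-side bus. The figures below are assumptions: the 1066MHz 64-bit FSB and 2.66GHz clock come from the article, and the worst case of one 8-byte operand fetched per core per cycle is a deliberate upper bound, not a measurement:

```python
# Can the shared front-side bus feed four streaming cores?
FSB_HZ = 1066e6        # 1066 MHz front-side bus (from the article)
BUS_BYTES = 8          # 64-bit data bus
CORE_HZ = 2.66e9       # Kentsfield clock (from the article)
CORES = 4
BYTES_PER_OP = 8       # worst case: one double fetched per core per cycle

supply = FSB_HZ * BUS_BYTES              # ~8.5 GB/s, shared by all cores
demand = CORES * CORE_HZ * BYTES_PER_OP  # ~85 GB/s if nothing hits cache
print(demand / supply)                   # ~10x oversubscribed
```

In that regime the cores stall on main memory most of the time, which is exactly why per-node memory (as on the Opteron) or cache-friendly working sets matter more than raw core count.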

I can't help but feel they're... (2, Funny)

NoMoreNicksLeft (516230) | more than 7 years ago | (#16836956)

Missing a marketing opportunity. ... now with Intel Foursome!

Re:I can't help but feel they're... (1)

H0Z3R (1027074) | more than 7 years ago | (#16837000)

HA! Where are the Blue Man Group performers when you need them to introduce the Quad chips?! News headline: Blue Man Group adds a performer to support Intel's launch advertising campaign showcasing the new Quad Core CPU! ... yeah...

Re:I can't help but feel they're... (1)

stupidfoo (836212) | more than 7 years ago | (#16837236)

Worst. Post. Ever.

Re:I can't help but feel they're... (3, Funny)

Anne_Nonymous (313852) | more than 7 years ago | (#16837632)

>> ... now with Intel Foursome!

Sounds... hot.

Re:I can't help but feel they're... (0)

Anonymous Coward | more than 7 years ago | (#16838630)

Little did they know that they could have capitalized on the sex appeal with the Dual Core, as a twosome is more than most nerds get...

overkill (1)

thejrwr (1024073) | more than 7 years ago | (#16836960)

What they need to do is make a Multi-Core NATIVE OS, so even single-threaded apps can use more than 1 core. Also, why don't they just make dual-core processors faster? It seems the only way we are going to get ahead in the field.

Re:overkill (4, Informative)

pla (258480) | more than 7 years ago | (#16837124)

What they need to do is make a Multi-Core NATIVE OS, so even single-threaded apps can use more than 1 core

Other than jumping between cores to improve heat dissipation, how do you propose to make a highly serially-dependent algorithm run on more than one core at a time? Until computers can actually make programmers redundant by writing their own code given a high-level English description of the task (and even then, you'll still have some provably serial code), multithreading will remain at the whim of the programmers, not the scheduler.



also why don't they just make dual-core processors faster?

For the same reason they stopped the MHz war and moved to a core war in the first place... Making each core faster has started to hit physical limits (power draw and heat dissipation, electromigration in progressively smaller transistors, clock speeds limited by the speed of light across the width of the chip, etc.). Make no mistake, speeds will keep creeping up over time, but the era of 18-month speed doubling ended a few years ago. Major new improvements will either involve radical new technologies such as quantum computing (and no, spintronics and diamond substrates will only yield incremental improvements), or what we see now: the move toward massive parallelism.



seems the only way we are going to get ahead in the field

Gaming, while interesting, does not drive research into the highest end of computing.
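The speed-of-light limit mentioned above is easy to quantify: even ignoring gate delays, a signal cannot cross more than c/f of wire in one clock period. A rough calculation, using the Kentsfield clock from the article and assuming a die a couple of centimetres across:

```python
C = 2.998e8  # speed of light in vacuum, m/s (on-chip signals are slower still)

def light_travel_per_cycle_cm(clock_hz):
    """Upper bound on how far a signal can propagate in one clock tick."""
    return C / clock_hz * 100  # convert metres to centimetres

print(light_travel_per_cycle_cm(2.66e9))  # ~11 cm: a few chip-widths at 2.66GHz
print(light_travel_per_cycle_cm(30e9))    # ~1 cm: comparable to the die itself
```

At a few GHz there is still headroom, but in the tens of GHz a signal could no longer cross the chip within a cycle, which is one reason clock scaling gave way to more cores.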

Re:overkill (1)

DimGeo (694000) | more than 7 years ago | (#16838540)

...how do you propose to make a highly serially-dependent algorithm run on more than one core at a time?
Intel did some research on this. I think the idea was more or less the same as having multiple pipelines in a single core. If it could be integrated into the OS, or better yet on the chip itself (making it manifest as a single CPU to the system), some nice results could be achieved, I suppose; but then we're still where we were a few years ago with a single CPU with many pipelines... only faster...

Re:overkill (1)

Name Anonymous (850635) | more than 7 years ago | (#16837670)

What they need to do is make a Multi-Core NATIVE OS, so even single-threaded apps can use more than 1 core

Actually many of the Unix and Unix like OSes out there do something like this. Anytime a system call is made, you might wind up with multiple threads for the application.

Also, at the very least, on a dual core (or dual processor) system, you get one core running your heavy application and the other one running the rest of the stuff - IM client, mail client, system stuff, etc.

As it stands, on my Mac I have a handful of applications that make pretty efficient use of 2 cores. And of course not all applications are worth writing as threaded applications, as they don't use enough resources for it to matter. As the hardware gets out there, the applications will follow.

Re:overkill (1)

LurkerXXX (667952) | more than 7 years ago | (#16838546)

That's kind of like asking why chips don't have just one really-really-really fast pipeline rather than several. With several you can do more things at once.

There will always be a speed limit that your electronics can run at, and we are pushing against some heat/size limits. The most realistic way to go faster is to split up the tasks as much as possible and have multiple pipelines/cores/CPUs/computers work on them at once.

Re:overkill (1)

99BottlesOfBeerInMyF (813746) | more than 7 years ago | (#16838856)

What they need to do is make a Multi-Core NATIVE OS, so even single-thread apps can use more than 1 core...

There are limits to this of course. One interesting step is OS X 10.5's ability to spawn a special GPU feeder process for OpenGL apps. The idea is for some OpenGL libraries and some things that are OS functions to automatically become their own thread, running on another core, thus speeding up programs that are CPU bound but have not been reworked/recompiled to take advantage of multiple cores. At least that is my understanding. In a perfect storm it could double the speed of an OpenGL app.

Say hello ... (0, Offtopic)

psergiu (67614) | more than 7 years ago | (#16836966)

... to the next mac pro [apple.com] .

What about other parts of the computer? (1)

s31523 (926314) | more than 7 years ago | (#16836988)

I know CPU power is a big factor in performance, but c'mon.. What about extending the rest of the motherboard? I bet things would run faster in dual/quad core mode if there were dual buses so that bottlenecks are reduced to peripherals and memory.

Re:What about other parts of the computer? (1)

larkost (79011) | more than 7 years ago | (#16837300)

That is the whole point of having multiprocessor systems.

Re:What about other parts of the computer? (1)

10Ghz (453478) | more than 7 years ago | (#16837362)

FSBs have been getting faster, the expansion buses (PCI, PCI-E, etc.) have been getting faster, and memory has been getting faster and resides on wider buses.

If your complaint is the FSB, AMD has something better for you. So what are you looking for here, really?

Re:What about other parts of the computer? (4, Interesting)

s31523 (926314) | more than 7 years ago | (#16837526)

Faster, good... Wider, good... But why not parallel with dual-DMA? Right now, it seems you could have 10 cores, but if all the threads running on each core have to contend for 1 bus, it doesn't matter how fast the bus is. I want each core to be able to access its own memory so it is not blocked by the other cores accessing memory. I want one core to be able to access my NIC while another accesses the hard drive and another accesses the video card. All this requires some sort of parallel bus setup. It is my understanding we have not done this sort of architecture yet, but if we keep increasing the number of processor cores, this would seem to be the next step. BUT, I am not a hardware guy. I am a software guy, and expect it all to just work! :)

Re:What about other parts of the computer? (1)

i23098 (723616) | more than 7 years ago | (#16837796)

I want each core to be able to access its own memory so it is not blocked by the other cores accessing memory

I believe you're talking about NUMA [wikipedia.org] computers...

I want one core to be able to access my NIC while the other accesses the hard drive and the other access the video card.

If the bus is fast enough (compared to device speed) you can simulate it, just like a single-core computer can do multitasking...

We're not quite there yet, but I believe you've seen the future... a real close future ;)

Re:What about other parts of the computer? (3, Informative)

10Ghz (453478) | more than 7 years ago | (#16837876)

"I want each core to be able to access its own memory so it is not blocked by the other core's if it is accessing memory."

Say hello to AMD, HyperTransport and integrated memory controllers. Each CPU has its own bank of RAM, and each CPU is directly (well, an 8-socket system needs one intermediate hop) connected to the other CPUs, and they can access the RAM connected to the other CPUs as well. So if you have a dual-socket system, each socket has its own RAM bank, with a 128-bit bus between the CPU and the RAM, and the CPU can access the RAM attached to the other CPU as well. So as the number of CPUs goes up, the memory bandwidth goes up as well.

This tech has been used since 2003 in AMD's x86-64 CPUs. In the future AMD will have systems where you can plug co-processors and video cards into HyperTransport sockets, allowing them to communicate directly with the CPUs.

Re:What about other parts of the computer? (1)

s31523 (926314) | more than 7 years ago | (#16838034)

But... Does it come with a Hemi? :)

Nice! I gotta get me one of those!

Re:What about other parts of the computer? (1)

kabocox (199019) | more than 7 years ago | (#16838222)

Faster, good... Wider, good... But why not parallel with dual-DMA? Right now, it seems you could have 10 cores, but if all the threads running on each core have to contend for 1 bus, it doesn't matter how fast the bus is. I want each core to be able to access its own memory so it is not blocked by the other cores accessing memory. I want one core to be able to access my NIC while another accesses the hard drive and another accesses the video card. All this requires some sort of parallel bus setup. It is my understanding we have not done this sort of architecture yet, but if we keep increasing the number of processor cores, this would seem to be the next step. BUT, I am not a hardware guy. I am a software guy, and expect it all to just work! :)

Um, wait for the 3rd or 4th generation of this before you buy it, then. I'd predict that they'll get around to it after a while, but not as soon as we'd like. I'm kinda startled that we've got news of quads already. I don't really pay attention to the chip core wars any more. If this keeps up, I'll turn around and they'll be at 8 or 16. Remember they aren't selling this to the /. crowd; Intel and AMD are switching from MHz to cores and aiming at mom and pop or Joe Average Walmart buyer. I guess that I can see reasons for the quad already. You can have the Walmarters get on the lower-end dual core and brag that they have more than one. Then you have gamers, really high-end folks, or early adopters brag that they have four cores. The software advantage won't be there for a while. Remember this is as much a marketing thing as a tech thing. Intel and AMD aren't rushing into this. I'm wondering how long it will be before we run into "core limits." Will 16, 64, 256 cores be the limit? We'll find out over the next 5-10 years if this is the area Intel and AMD plan on developing.

Not native Quad core (3, Informative)

kid_oliva (899189) | more than 7 years ago | (#16836992)

From what I've read about Intel's quad core, it is not native like AMD's will be. They are basically going to have two dual cores that communicate via the FSB. That sucks compared to AMD's offering, which will be native.
http://xstremehardware.co.uk/index.php?option=com_frontpage&Itemid=1&limit=10&limitstart=20 [xstremehardware.co.uk]

Re:Not native Quad core (2, Insightful)

k_187 (61692) | more than 7 years ago | (#16837134)

yes, but communication between the 2 cores in each of the sets will be faster than between any of AMD's cores. My guess is that it'll be a wash. The other (in my opinion more important) thing is that Intel is shipping now, while AMD is about a year away. By then I believe that Intel will have a quad-core on-die chip out. Either way, more FPS!

Re:Not native Quad core (1)

Threni (635302) | more than 7 years ago | (#16837150)

> yes, but communication between the 2 cores in each of the sets will be faster than any of AMD's
> cores

Isn't it also going to use a lot more power than a native 4 core though?

Is that true? (1)

Junta (36770) | more than 7 years ago | (#16837550)

I doubt intercore communication in the Intel design is any better than in Barcelona (it's impossible to say this far out in any event). What is known (I have worked with some of the quad-core Intel stuff) is that the two dies produce a higher load on the FSB and require the FSB to be clocked down from the equivalent dual-core model. This means that AMD's remaining advantage over Intel's offerings is made more drastic (aggregate memory performance, particularly in multi-socket configurations). I.e., an Intel that thoroughly spanks a high-end two-socket AMD offering Linpack-wise (4 flops/clock) will offer as little as 33% of the STREAM performance of the AMD offering. So, particularly at the high end, there remains no clear answer about which solution to pick, as Intel currently far and away has the best performance once the data has reached the cache, but if the data set being operated on within small periods of time exceeds the cache, AMD can still win. This is one of the reasons HPCC has merit for measuring multiple aspects of a cluster (i.e., aggregate memory performance, node interconnect, as well as traditional Linpack tests); it's not so simple to say what is best unless an architecture allows for superior performance across the board.

Re:Is that true? (1, Insightful)

dfghjk (711126) | more than 7 years ago | (#16837992)

When comparing quad-core approaches, aggregate memory performance of multi-socket, multi-memory-controller designs is irrelevant. No doubt the AMD approach scales better, but that's not important to the argument. When AMD announces a single-die processor with multiple integrated memory controllers, then it matters. Offsetting AMD's memory throughput advantages are Intel's much larger caches. It's a complicated subject.

Intel's approach gets quad core to market far faster, and once AMD can deliver quad core on a single die, Intel will be able to do the same. Meanwhile, there is no evidence that Intel's MCM approach is substantially inferior performance-wise. AMD's short-term response, 4x4, is quite a joke by comparison. Anyone worrying about power consumption with Intel's solution isn't concerned with AMD's 4x4 design? I would much prefer a single-processor, single-memory-controller system with 90% of the performance of the AMD dual-proc beast. Of course, I pulled that number out of my ass...

Re:Is that true? (1)

k_187 (61692) | more than 7 years ago | (#16838018)

It actually is, as each pair of cores shares its L2 cache. So within each pair it's better, but not between the two pairs (as that is covered by the FSB) nor between the whole and the system. We'll see if this makes up for the FSB problem (probably not).

Re:Not native Quad core (2, Informative)

OrangePeril (739827) | more than 7 years ago | (#16837704)

True, it is not native quad core. However, AMD's first venture into quad core will not be native either. In an effort to catch up to Intel, they will also be releasing a quad-core processor that's "taped together" like Intel's.

I recently met with an Intel rep and they are very much pushing their new core architecture. Quad core this year, octo-core next... Core count is the next clock speed. However, none of it matters until the software manufacturers can take advantage of it, and very few server applications can at this point, let alone games.

Reference: http://www.itwire.com.au/content/view/7120/53/ [itwire.com.au]

Re:Not native Quad core (2, Informative)

dfghjk (711126) | more than 7 years ago | (#16838098)

In the timeframe Intel offers this, AMD will have no quad-core part at all. Considering that, it's clear that AMD sucks, not Intel. Later on, Intel's "native" version (Yorkfield, discussed in your link) will have cache improvements and a bump in FSB speed. All things considered, the dual-die part doesn't look like it sucks at all (except for AMD).

There are three sides to this: Intel's, AMD's, and the truth.

Re:Not native Quad core (1)

fitten (521191) | more than 7 years ago | (#16838154)

From what I've read about Intel's quad core, it is not native like AMD's will be. They are basically going to have two dual cores that communicate via the FSB. That sucks compared to AMD's offering, which will be native.

Intel's method is designed to give higher yields than a "true" quad-core approach. In any case, take a look at the performance numbers and see how much it hurts not being "true" quad core. Sure, that doesn't mean that there's no room for improvement, but the "true quad core" argument is sort of fanboyish.

For example, AMD fanbois tend to trot out synthetic benchmark bandwidth numbers and the IMC all the time, but if you look at a large swath of benchmarks, even though an Athlon X2 has higher synthetic memory bandwidth scores, see which of the Core 2 Duo and the Athlon X2 has higher non-synthetic benchmark scores.

Funny thing happened on the way to IT Support... (2, Interesting)

topham (32406) | more than 7 years ago | (#16837064)


We were issued laptops at the start of the project. The typical laptop is a Thinkpad T42p. They average somewhere between 1.6GHz and 1.8GHz.

Some people were complaining about performance (Java is a hog, and they were using stuff that makes Java look 'light'), so they requested new machines.

They were issued Core 2 Duo systems that are 1.8GHz, with 2 gigs of RAM. These machines are nice. The guy from IT Support comes up to replace the system and starts saying that he doesn't know why we would upgrade to the desktops; they are the same speed as the laptops.

Ok, I expect that from some guy off the street, but IT Support?

(Note: For this work there is a significant speed difference, it is obvious, and almost immediate.)
Never mind the differences between a single core from a Core 2 Duo, and the core used in a Thinkpad anyway...

Re:Funny thing happened on the way to IT Support.. (0)

Anonymous Coward | more than 7 years ago | (#16837130)

Send benchmarks to management with a report of what this guy said and ask if you can do your own support.

NEEDS MORE RAM (0)

Anonymous Coward | more than 7 years ago | (#16837072)

Unmarketable schlock!

I have one of these babies (3, Funny)

Nichotin (794369) | more than 7 years ago | (#16837090)

.. since I am a journalist for a computer rag. Anyway, I would say it is a waste of money for most people at this time. Applications can barely use two cores properly, and games are still not as SMP-aware as they should be. On the other hand, if you run Gentoo, THIS CPU IS KILLER :)

Re:I have one of these babies (2, Insightful)

mgblst (80109) | more than 7 years ago | (#16837152)

You know, it doesn't take being a computer journalist to realise that any chip released in the last 3 years is a waste of money for most people. Most people mainly use the computer for browsing the net, and despite Intel's previous claims, a faster processor won't make any difference. And despite adverts on UK TV reporting that with the new dual cores you can read email and listen to music, you don't need 4 cores or 2 cores to do any of that.

The whole thing is a joke, for most people. Like cars that go 1000 mph, what is the point?

Re:I have one of these babies (0)

Anonymous Coward | more than 7 years ago | (#16837180)

The whole thing is a joke, for most people. Like cars that go 1000 mph, what is the point!
You naysayers piss me off. The point will come when everyone has one.

Re:I have one of these babies (0)

Anonymous Coward | more than 7 years ago | (#16837208)

As GP said, for large builds having multi cores rock. For virtualization, multi cores rock. For content creation or transcoding, multiple cores rock. If you want to be pedantic, most people don't need computers, games consoles, TV, iPods or cell phones at all. Me, I want one.

Re:I have one of these babies (1)

laffer1 (701823) | more than 7 years ago | (#16837616)

It depends on whether people are using Flash. Try running a Flash 9-intensive site on an old PC or Mac. It will not keep up. To some degree, you do need a faster processor to handle Flash and the new "high-def" video codecs coming out. It entirely depends on what you use the net for, but it is important to some people.

I noticed a difference upgrading from a dual Xeon 2.0GHz to a dual-core Pentium D 805 (2.66GHz) with QuickTime streams, for instance. Aside from my poor choice in video card (GeForce 7300), my new system is much faster than my old one. I didn't buy the new system to speed up QuickTime; I just wanted a 64-bit processor for experimentation purposes.

Re:I have one of these babies (1)

r3m0t (626466) | more than 7 years ago | (#16838662)

"adverts on UK tv reporting that with the new dual cores, you can read email and listen to music"

I complained to the Advertising Standards Authority about one of those adverts, which was made by PC World. I'm still sour about it. I quote from the letter:

"We did not consider that the advertisement implied that dual core processors were the only type of processor that could multi-task, or that they improved internet connectivity or performance. [the sales guy said something like 'playing a game while downloading music'] We consider that the advertisement is merely highlighting the functionality of a dual core PC, and is therefore unlikely to mislead." (Followed by some crap about how considerate and fair they were.)

So I guess those adverts are here to stay.

Re:I have one of these babies (4, Interesting)

pla (258480) | more than 7 years ago | (#16837326)

since I am a journalist for a computer rag.

I will say "lucky bastard", but that also explains your follow-up comment:



Applications can barely use two cores properly, and games are still not as SMP aware as they should.

Although apps and games have started to improve their multithreading, you don't get multi-core for single-app performance. You get it so you can play a modern FPS at the same time you have DVD Shrink backing up a movie for you, with little to no slowdown to either. With a quad core, you can add in two more CPU-sucking tasks, again with little to no slowdown (though currently, memory needs to catch up to the task of dealing with more cores).

Six(ish) years ago, I got my first dual-CPU machine. Almost nothing except the OS itself ran multithreaded at that time, and the improved performance of the machine just blew me away. Only last year did I eventually decommission that ancient dual-CPU box, because modern single-CPU speeds had finally passed it (and I still would have held out, except for the knowledge that I could do an in-place upgrade to a dual-core CPU whenever I wanted to).

So you may not see the point of multi-cores, when your favorite game won't run any faster on four than on one. But that doesn't even come close to meaning that "most" people won't benefit. Quite the opposite, I'd say that only hard-core gamers wouldn't benefit. Everyone else will feel the improved responsiveness the first time they touch a multi-core box.

Re:I have one of these babies (1)

CodeMasterPhilzar (978639) | more than 7 years ago | (#16838418)

Lucky you! (to have one of them to play with)


Where I work, we run a huge simulation in real-time. Right now, today, I could use every single cycle they have on their quad cores and more. Bring it on! Right now we have to reconfigure the simulation for highest fidelity but less than real-time, vs real-time at lower fidelity. Give me more cycles, more cores -- the sim is multiprocess -- and I can maybe, maybe go hi-fi and real-time. Of course, as soon as we have more CPU horsepower, we'll up the fidelity even more...

Two what? (1, Funny)

Anonymous Coward | more than 7 years ago | (#16837140)

However, when only running two threats, "Kentsfield is quite wasteful with its power consumption," Shimpi said.

Two threats? You surely aren't running Windows, man...

Soon (2, Insightful)

Mateorabi (108522) | more than 7 years ago | (#16837192)

Soon the number of cores in my desktop machine will surpass the number of blades in my shaving razor.


But seriously, as it gets harder and harder to make larger CPUs run faster the trend is going to be more, smaller processors per die. Each core is by itself slower than a huge monolithic one, but the sum is greater thanks to non-linear scaling. The trick is getting software to efficiently utilize them all.

Hype (4, Funny)

h2g2bob (948006) | more than 7 years ago | (#16837196)

It's going like razor blades: the razor blade companies just try to outdo each other on how many blades can be placed in a single razor. At this rate, expect as many processors as you can physically fit on the die, plus an extra processor for those tricky, hard-to-reach programs.

Multi use appliances (1)

Lost Penguin (636359) | more than 7 years ago | (#16838064)

The Intel computer/rangetop will fry hamburgers faster than the AMD version, with no loss of computing power

Bigger Better Faster (0)

Anonymous Coward | more than 7 years ago | (#16837228)

Saying quad core is overkill is like saying my Porsche has too much horsepower. Or that my girlfriend's bewbies are too big. Sure, it might be more than you need on your daily commute, but it's nice to play with on the weekend.

Bigger, faster, and it's never soon enough.

Re:Bigger Better Faster (1)

GNious (953874) | more than 7 years ago | (#16837464)

you need your girlfriend's *what* on your daily commute???

/G

I blame Gillette (1)

gjuk (940514) | more than 7 years ago | (#16837258)

Remember when one blade was enough?

Benchmarks! (2, Informative)

Ironsides (739422) | more than 7 years ago | (#16837400)

Here's one from Toms Hardware. [tomshardware.com]

Intel's right. On games it doesn't do any better. On video, though, well, let's just say I know some architecture majors who would have loved these in their lab several years ago, when 1 frame took 10 minutes to render. And they had 300-frame videos to do.


Vista? (0)

Anonymous Coward | more than 7 years ago | (#16838576)

Quad core, 2.8GHz, 4GB RAM, ...

Should allow Vista to rise to "sluggish".

Hanna-Barbera's two cents (1)

flyneye (84093) | more than 7 years ago | (#16838606)

flyneye: I'm sitting here today with the famous Snaggle Tooth, whose home cluster has been busy with the new quad processor. So, what say, Snag, is this processor cool or what?
Snaggle Tooth: Heavens to Murgatroyd, I've got an erection, a big one even.
flyneye: Well, there you have it, folks: Intel in the lead for the next 5 minutes.

Who is "Average"? (1)

Hercules Peanut (540188) | more than 7 years ago | (#16838666)

"For the average office worker who uses their PC for general productivity apps, such as communications and garden-variety computing, Smith recommended the Core 2 Duo from 'a price point and performance perspective.'"

I admit it. I'm not as hard-core as a lot of Slashdot. I'm not a hacker or a programmer, and I barely use Linux (mostly for TiVo modding... o.k., I'm a little bit of a hacker, but just a little). Still, when I see new chips like these that are much faster for digital content creation and not for the "average office user", I find myself scratching my head. Sure, the cubicles running MS Office and IE don't need this power, but the "average" home user may very well.

Think about it. What is the average home user doing? I think it has a lot to do with digital audio and video. We are making home movies, converting our DVD collections to MP4s and mixing our own music. Most of this can be done with iLife if you are a Mac user (for example), but the hundreds of gigs of video I have of my family require far too much of my time to "Rip, Mix and Burn".

Am I so different? It seems to me that the high-end workstation user and the "average" home PC user really want and need the same thing from a productivity standpoint. For those of us who want to move into the digital home lifestyle, processing power is still a limiting factor. I for one find myself setting up my computer to encode video overnight far too often.

Does anyone else see the top-of-the-line high-end processor as a very usable tool for the real average computer user, or am I expecting too much from the average user? Really, even my dad wants to make digital videos of the family from time to time but doesn't have the hours it takes to do it.

Yeah, Right (3, Insightful)

Nom du Keyboard (633989) | more than 7 years ago | (#16838676)

Their Quad Core offering, Intel is claiming, is up to 80% faster than the dual-core Conroe released this past July.

Yeah, that much faster on carefully selected software. And slower on some single-threaded applications that rely most of all on clock speed and uncontested memory bus access.

Would be nice for once to have headlines read something more honest like:

Speed improvements range from -20% for 50% of your software, up to +80% for 10% of your software.

There could even be a nice graph of how much software is improved (or degraded) at each 5% bin of performance. Otherwise it's no more honest than saying that your new Ferrari is capable of speeds up to 220mph, without mentioning that this can only be utilized during .01% of your driving.
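The uneven gains this poster describes are essentially Amdahl's law; a back-of-the-envelope sketch (the fractions below are made-up examples, not Intel's numbers):

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial fraction of a program caps the achievable
    # speedup no matter how many cores you throw at it.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# With 80% of the work parallelizable, 4 cores give 2.5x, not 4x.
# With only 20% parallelizable, 4 cores give roughly 1.18x.
```

So a "4x" chip only delivers near-4x on the rare software that is almost entirely parallel, which is exactly why blanket "up to 80% faster" headlines mislead.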
