
AMD Bulldozer Information and Benchmarks Leaked

timothy posted more than 2 years ago | from the just-a-pinch-of-salt-between-cheek-and-gum dept.


MojoKid writes "With Bobcat and Llano launched, AMD has one more major product overhaul set for this year. The company's Bulldozer CPU will launch in the next few months, and after years of waiting, enthusiasts and IT industry analysts are both curious to see what AMD has in its high performance pipeline. According to recently leaked info, one of the new AMD octal-core processors will be an AMD FX-8130P running at 3.2GHz base speed, with what's reported as a 3.7GHz Turbo speed, and a 4.2GHz clock speed if only half the CPU's cores are in use." Writer Joel Hruska justly points out that measures based on unofficial data and unreleased chips are subject to all kinds of potential errors, not to mention Photoshop.

126 comments

Photoshop (1)

SquirrelDeth (1972694) | more than 2 years ago | (#36772304)

has what to do with clock speed?

Re:Photoshop (1)

Anonymous Coward | more than 2 years ago | (#36772352)

Means that the images could have been made/doctored in Photoshop.

Re:Photoshop (3, Informative)

VisualD (1144679) | more than 2 years ago | (#36772364)

There is a 3DMark 11 result screenshot dated 01/02/2008; they are implying the result is fake.

Re:Photoshop (2, Funny)

SquirrelDeth (1972694) | more than 2 years ago | (#36772392)

My desktop clock is always wrong, except in Fedora. Why Fedora? My SUSE is 2 hr 31 min off. Why does Linux hate the desktop clock so much?

Re:Photoshop (0)

Anonymous Coward | more than 2 years ago | (#36772410)

Distros normally default to UTC for the hardware clock. Windows uses local time. If you're dual booting, there's your problem.
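The dual-boot offset can be sketched with a toy calculation: if the RTC holds UTC but one OS reads it as local time, the displayed time is wrong by exactly the UTC offset. The UTC-5 zone below is an illustrative assumption, not anyone's actual configuration.

```python
from datetime import datetime, timedelta, timezone

# Suppose the hardware (RTC) clock holds UTC, as most Linux distros expect.
rtc_utc = datetime(2011, 7, 15, 18, 30, tzinfo=timezone.utc)

# An OS configured for local time (say, UTC-5) reads the same RTC value
# as if it were already local, so no offset is applied on read.
misread_as_local = rtc_utc.replace(tzinfo=timezone(timedelta(hours=-5)))

# The wall-clock error equals the full timezone offset: 5 hours here.
error = misread_as_local - rtc_utc
print(error)  # 5:00:00
```

The same mismatch in the other direction (RTC in local time, OS expecting UTC) shifts the clock the opposite way; either way the error is a whole number of timezone hours, which is why a 2:31 offset needs a second explanation.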

Re:Photoshop (1)

OrangeTide (124937) | more than 2 years ago | (#36772790)

That doesn't really explain the 31 minutes.

Re:Photoshop (1)

realityimpaired (1668397) | more than 2 years ago | (#36773182)

Newfoundland is in a weird timezone, half an hour off. They're 1.5h ahead of Eastern time, which is 4h ahead of UTC in the summer, so I'm guessing the poster is in NF, which is currently 2.5h ahead of UTC, and that his hardware clock is slightly off.

Re:Photoshop (1)

realityimpaired (1668397) | more than 2 years ago | (#36773186)

2.5h *behind* UTC. gods, it helps to proofread, but it would help to be awake when I post... NF is west of the prime meridian, not east of it.

Re:Photoshop (1)

Anonymous Coward | more than 2 years ago | (#36772826)

Since Vista you can use UTC in the RTC.

Add DWORD HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation\RealTimeIsUniversal
and set it to 1.

Re:Photoshop (0)

Anonymous Coward | more than 2 years ago | (#36776364)

But this requires editing the registry. This is something that eludes most freetards. Many readers here can't even keep their Windows machines from bluescreening daily due to all the misconfiguration and abuse they put them through.

Re:Photoshop (2)

maxwell demon (590494) | more than 2 years ago | (#36776368)

Since Vista you can use UTC in the RTC.

Add DWORD HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation\RealTimeIsUniversal
and set it to 1.

Well, once again the superior intuitive and explorable interfaces of Windows are demonstrated. :-)

Re:Photoshop (1)

mfwitten (1906728) | more than 2 years ago | (#36772422)

Perhaps Fedora, but not SUSE, is running an NTP (Network Time Protocol) client; or perhaps SUSE is configured for the wrong time zone; or the hardware clock and SUSE's configuration disagree on whether time is stored as local time or as UTC (offset +0000).

Re:Photoshop (1)

Eskarel (565631) | more than 2 years ago | (#36772428)

If you're running that Linux desktop in a VM, there are some kernel patches required to fix the clock drift; Fedora probably has them and SUSE doesn't.

Re:Photoshop (1)

aztracker1 (702135) | more than 2 years ago | (#36773092)

I usually correct the setting in VMware; not sure about others... something I keep forgetting about until my midnight reports show up in the late morning.

Re:Photoshop (1)

EricX2 (670266) | more than 2 years ago | (#36774992)

But is your clock often 3 YEARS wrong? I've found in the past that stuff goes weird when your clock is that far off; SSL certificates fail on websites, for one.

Also, they are using a new motherboard. Even if the clock were reset, it should show the current year. My motherboard from last year doesn't default to 2008; it defaults to 2010 when it is reset.

Re:Photoshop (3, Informative)

rhook (943951) | more than 2 years ago | (#36772414)

Except 3DMark 11 didn't exist until last December; it's more likely that he never set the BIOS clock.

Re:Photoshop (1)

hairyfeet (841228) | more than 2 years ago | (#36773910)

Well, to be fair, if this bunch just did a quick Windows setup to run this chip, frankly I wouldn't be surprised if they ran something like TinyXP/7 or one of the "Razr1911" Windows builds, all of which seem to have the time service disabled.

But I think the more important thing to note is that we won't know the true performance of the new AMD chips until about 8 months after the first Bulldozer chips are released. Why is that? Because they are currently preparing to switch to a whole new APU arch [slashdot.org] where the GPU will be MUCH more tightly integrated, as well as a completely new GPU design (from VLIW to a new vector-based GPGPU), which will have several benefits. One, the FP on the thing will just be insane, and double precision will go from 1/5th the performance of single precision to more than half; two, they are going to share the cache between CPU and GPU; three, they are gonna have hybrid graphics where the discrete card paints the screen and the APU does physics; four, a completely new SMT design which drivers are gonna have to be optimized for, unlike anything AMD or Intel has put out before. As one reviewer I saw put it, the new design is "SMT done right", with the extra integer path keeping bottlenecks to a minimum.

So I'd say while this might give a few rough ideas (the review puts it about in the middle of Sandy Bridge, and that is without optimized drivers), an arch THIS radical will need motherboards and drivers designed for it, which simply don't exist ATM. I'm personally glad that AMD is still quite competitive, as after the bribery and compiler scandals I put my money where my mouth was and switched to an AMD-only shop, and my customers couldn't be happier. They are happy with low prices on triples and quads that have great integrated GPUs (having HDMI onboard is nice), and I'm happy not to be supporting a company that should have been busted for antitrust years ago. It's a win/win, and from the looks of things these new chips will just make for more happy customers. I'd say it'll make me happy too, but my current AMD quad is fast enough that I don't see myself building another personal machine for a good 5 years.

But, (0)

Anonymous Coward | more than 2 years ago | (#36772324)

will it blend?

Re:But, (-1)

Anonymous Coward | more than 2 years ago | (#36772336)

It will blend some frothy shit. And probably burn your penis, AMD lover. But at least you are not fucking an Intel.
Sick perverts and their weird fetishes.

Photoshop (0)

Anonymous Coward | more than 2 years ago | (#36772366)

Photoshop & Illustrator seem to be particularly unstable with the AMD Phenom X4 quad-core processor at 3.4GHz.

Why the hype? (1, Interesting)

Anonymous Coward | more than 2 years ago | (#36772386)

I don't really understand the hype behind Bulldozer. Do people really believe that it'll be on par with Sandy Bridge? The $200 2500K competes well with Intel's own $700+ CPUs. That is absolutely ridiculous performance for the price, performance I wouldn't have dreamed of 5-10 years ago.

Sure, maybe having more cores will mean better multi-threaded performance, but this still isn't taken advantage of. I don't see Intel losing in the single-threaded department anytime soon.

Re:Why the hype? (4, Insightful)

Mad Merlin (837387) | more than 2 years ago | (#36772432)

Would you rather AMD go out of business and Intel charge $2000 for that $200 CPU?

Blame the sell-outs. (1)

Anonymous Coward | more than 2 years ago | (#36772856)

In 1996, Digital Equipment Corporation had an Alpha processor fabricated in a bad process, uncorrected until 1999, that otherwise had the potential to play Doom 3 in SOFTWARE RENDERING. Once the corrected process reached the same processor, DEC became the first company ever to reach 1GHz, in 1999, though it could've been done in 1996. The $10k workstation, made in America, still had more potential than AMD and Intel, but they were sold out by Compaq and Hewlett-Packard.

Re:Blame the sell-outs. (0)

Anonymous Coward | more than 2 years ago | (#36773414)

Wow, where can I read more about this?

Re:Blame the sell-outs. (0)

Anonymous Coward | more than 2 years ago | (#36773450)

Wikipedia [wikipedia.org] doesn't say anything about a "bad process", but it's clear that Alpha beat Intel-compatibles due to better processor design, even though Intel-compatibles had the edge in manufacturing process.

Re:Blame the sell-outs. (1)

Waffle Iron (339739) | more than 2 years ago | (#36774078)

In 1996, Digital Equipment Corporation had an Alpha processor fabricated in a bad process, uncorrected until 1999, that otherwise had the potential to play Doom 3 in SOFTWARE RENDERING. Once the corrected process reached the same processor, DEC became the first company ever to reach 1GHz, in 1999, though it could've been done in 1996. The $10k workstation, made in America, still had more potential than AMD and Intel, but they were sold out by Compaq and Hewlett-Packard.

Meanwhile Intel released the Pentium Pro architecture and its successors, which had a price/performance ratio competitive with Alpha, but could run the software people already had. With this, the Alpha was duly relegated to the dustbin of history.

Re:Why the hype? (1)

Verunks (1000826) | more than 2 years ago | (#36773020)

Would you rather AMD go out of business and Intel charge $2000 for that $200 CPU?

So let's hope some benevolent guy buys AMD CPUs, so I can buy a cheap six-core Sandy Bridge next year.

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36773850)

You mean doing a "netflix"?

Re:Why the hype? (-1)

Anonymous Coward | more than 2 years ago | (#36772452)

The hype about bulldozer is it can flatten your penis if it gets run over by one . Please people don't lay your shlongs across the road us penis re-constructionists are busy enough with the Lorena Bobbitts of the world. No one likes our job. Would you like to handle penis all day?

Re:Why the hype? (1)

Methuseus (468642) | more than 2 years ago | (#36774010)

I'm sure there are some who enjoy handling penis all day long. Just because you don't, doesn't mean nobody does.

Already proved on the high end (1)

dbIII (701233) | more than 2 years ago | (#36772466)

For some things that run well in parallel, their 4-way 12-core processors released some time ago are better than the newly released Sandy Bridge -- so OF COURSE it COULD be better. Whether the consumer CPU is as good or not is something that will be worth seeing. If it's nowhere near as good but a lot cheaper, that will also be worth seeing.

Re:Already proved on the high end (4, Interesting)

hairyfeet (841228) | more than 2 years ago | (#36774504)

I think most here are missing the forest for the trees. Unless you are a Crysis-playing ePeen "must win teh benchmarks!" type, AMD doesn't have to win; all they have to be is "good enough", which I would argue they currently are, and these new chips will simply make it better.

I currently have a Deneb AMD quad as my main home machine and slam the living hell out of it: video transcoding, using it as a Win 7 DVR, playing games for hours, often WHILE transcoding or recording, and you know what? It works great. And I'm a hardcore case; most folks still only do one task at a time, be it gaming, browsing, whatever. Most importantly, I have a machine that will do all that, as well as take a 6-core later on if I wish, with 1.5TB of HDDs and 8GB of RAM and an HD4850 and the whole smash, including Win 7 HP x64? Less than $600 after MIR.

And THAT is what matters, especially in a dead economy. Folks want a reasonably powerful machine that will last them for years and won't break their wallets, and AMD frankly gives them overkill for cheap. I have built fully loaded triples that crank out video at 1080p all day long for less than $450, quads for less than $500, and thanks to how long AMD sticks with sockets, if 5 years down the road they decide they want a little more oomph, I can pick them up a cheap OEM chip and just drop it in.

I have found that for the jobs of the vast majority of folks who walk into my shop, "good enough" was passed with the dual-core chips, but thanks to AMD, for nearly the same money they can go triple or quad, which just gives them more years of service without slowdown. Hell, the prices are so cheap I built my dad a quad for home. Does he need a quad? Oh hell no, he still single-tasks everything like it is 1993! But by going quad I know that no matter how much crap like Messenger he ends up running in the taskbar, he'll never lose responsiveness, and this machine will probably last him the rest of his life.

So unless you are doing the super heavy lifting like multiple compiles or hardcore video editing (and I'll admit there are more guys here who do such hardcore CPU pounding than in the general pop, by a long shot), all the extra $$$ you spend by going Intel is simply wasted money. I'd say as long as AMD can stay within a third of the performance of the Intel chips they'll be "good enough" for the vast majority, and having nice low prices simply seals the deal.

Re:Why the hype? (2, Insightful)

c.r.o.c.o (123083) | more than 2 years ago | (#36772486)

I don't really understand the hype behind Bulldozer.

Do people really believe that it'll be on-par with Sandy Bridge? The $200 2500k competes well with their own $700+ CPU's. That is absolutely ridiculous performance that I wouldn't have dreamed of 5-10 years ago, for that price.

Sure, maybe having more cores will mean better multi-threaded performance, but this still isn't taken advantage of. I don't see Intel losing in the single-threaded department anytime soon.

You are still thinking raw CPU power matters. In a world where even web browsers are 3D accelerated, the GPU suddenly becomes extremely important, even more than the CPU. If you are gaming, the best CPU will still be crippled by the GPU present in that system, and that is what's happening with Intel.

If Bobcat and Llano are any indication, AMD will integrate a GPU that will be at least 2-4 times faster than the GPU in Sandy Bridge while consuming the same amount of power. And if some of the reviews I read are correct, the integrated AMD GPU will be able to work together with the discrete GPU for a 30-70% performance boost.

So if someone buys a very cheap system without a discrete GPU, Bobcat will be faster than Sandy Bridge, and may even be able to play some older games that choke on SB. And Bobcat will be faster for every day tasks as well such as browsing, playing flash movies and games, playing HD content, etc.

Now if someone buys a high end system with a discrete GPU, Bobcat will still be faster, because the integrated GPU will work with the discrete GPU. SB currently does not even have such a feature. Even if it did, SB's integrated GPU is still weaker by far than Bobcat's.

Re:Why the hype? (1)

kevinmenzel (1403457) | more than 2 years ago | (#36772532)

And in the case where raw CPU does matter? You know, like when you're mixing audio or something?

Re:Why the hype? (1)

Ambassador Kosh (18352) | more than 2 years ago | (#36772594)

Why would mixing audio be CPU bound? Wouldn't mixing audio be latency and IO bound?

Re:Why the hype? (1)

kevinmenzel (1403457) | more than 2 years ago | (#36772640)

Not when you're applying effects to the audio. At that point, you need CPU power. Fast CPU power too for each core if you are going to try to keep your latency down - the faster you can do your calculations, the less latency you're adding to the audio path.

Re:Why the hype? (1)

sourcerror (1718066) | more than 2 years ago | (#36772714)

I don't see how sound effects can't be done on the GPU. Of course, it'd take a lot of time to rewrite all the VST plugins.

Re:Why the hype? (1)

adolf (21054) | more than 2 years ago | (#36772810)

I don't see how sound effects can't be done on a GPU, either. But until that rewrite happens (which it ought to -- it makes too much sense), we'll still be CPU-bound for audio tasks.

(Are we discussing today, tomorrow, or the mysterious future?)

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36774436)

Ever written code for a GPU? They suck for latency.

The programming model basically runs as follows:
1. Copy the GPU program across the PCI express to the card
2. Copy the input data across the PCI express to the card
3. Run the program on the data.
4. Copy the results back across the PCI express

The GPU cannot access system RAM, so to offload computation from the CPU to the GPU, you must copy the problem onto the card and the answer off the card. GPUs excel at high-throughput computing, where one input has one output and there is a huge amount of parallel computation in between. Real-time audio mixing is not like that: you have thousands of inherently fairly serial computations, each computation is small, and you need the results of the intermediate computations; you can't process a whole song and then just get the answer at the end.

By my estimate, in the time it took you to move a sample of audio across the bus and back (not even performing the computation) you could have done it on the CPU several times over.
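That estimate can be put into rough numbers. This is a sketch only: the 10 µs PCIe round trip, 3 GHz clock, and 100 cycles per mixed sample are assumed figures, not measurements.

```python
# Back-of-envelope: PCIe round-trip latency vs. CPU cost per audio sample.
# All three numbers below are rough assumptions for illustration.
pcie_round_trip_s = 10e-6      # ~10 microseconds host -> GPU -> host
cpu_hz = 3e9                   # 3 GHz CPU
cycles_per_sample = 100        # generous budget for mixing one sample

cpu_time_per_sample_s = cycles_per_sample / cpu_hz  # ~33 ns
samples_doable_during_round_trip = pcie_round_trip_s / cpu_time_per_sample_s

# In the time one sample spends crossing the bus and back, the CPU could
# have mixed it hundreds of times over.
print(int(samples_doable_during_round_trip))  # 300
```

Under these assumptions the CPU processes a sample roughly 300 times faster than the bus can round-trip it, which is the poster's point: per-sample offload to a discrete GPU is dominated by transfer latency, not compute.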

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36776144)

All the drawbacks you mention are precisely what AMD is trying to eliminate with their new Fusion architecture. In a future generation of Fusion, the CPU and GPU will access the same virtual memory. This would make the call latency of a GPU process as low as calling a process on another CPU core.

In the long, long term, the x86(_64) legacy will probably live only in the decoding stage of a chip that feeds a massive array of schedulers and execution units.

Re:Why the hype? (1)

Joce640k (829181) | more than 2 years ago | (#36773494)

Why would mixing audio be CPU bound?

Audio programs like FL Studio run out of CPU *very* quickly -- you can have hundreds of sounds playing, all with effects processing, etc. I tried running it on a netbook once and it had no chance of keeping up...

Re:Why the hype? (1)

Issarlk (1429361) | more than 2 years ago | (#36772730)

Then you buy an SB. But for the vast majority of people an AMD will be enough, and cheaper; it's not like Farmville is CPU intensive.

Re:Why the hype? (1)

nyctopterus (717502) | more than 2 years ago | (#36773028)

Flash games aren't CPU intensive? What planet are you from?

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36773088)

Probably the one where we've got GPU-accelerated Flash. Now, from where do you hail, sir?

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36774482)

Planet Microsoft I suppose.

Re:Why the hype? (2)

realityimpaired (1668397) | more than 2 years ago | (#36773244)

Compiling and encoding/transcoding are the only tasks I can think of that are CPU-bound, and to some extent both are limited by I/O throughput as well. Most graphics cards have hardware decoders for the most common codecs, and most encoding isn't done by consumers. Transcoding usually isn't done by consumers either, though I suppose if you're ripping DVDs you're doing transcoding.

That said, it takes 15 minutes or so to rip a DVD into a 1GB MKV file on my Core i7 laptop. In other words, we are well beyond the point where most consumers will see CPU speed as a limiting factor in anything they want to do. CPU speeds are actually following a generally downward trend at the moment (except in the enthusiast markets), as the general tendency is towards reduced power consumption and reduced heat, and the realization that processors from 10 years ago were fast enough to surf the web, chat, and write documents. Gaming is really the only mainstream use where the speed is generally trending upwards, and in that market the power of your video card is far more important than the speed of your CPU... I would rather game on a system with a $200 CPU and a $500 graphics card than the other way around.

Re:Why the hype? (1)

WuphonsReach (684551) | more than 2 years ago | (#36775628)

processors from 10 years ago were fast enough to surf the web, chat, and write documents

Only if you do one thing at a time, with nothing else running in the background, and you never switch between programs mid-task. And that includes never leaving a web page with Flash ads running.

Otherwise, I think that multi-core becoming cheap around 2006-2007 is the big change. A 1.5GHz single-core CPU is always going to feel sluggish because the CPU is constantly pegged at 100% busy, with no place to put the overflow work. Switch to a dual- or quad-core CPU, even at a slower clock rate (2 cores @ 800MHz), and responsiveness goes way up.

If my 1.5GHz laptop from 2003-2004 had been multi-core, I might still be using it. But the dual-core Thinkpad that I replaced it with in 2007 is just a whole lot more enjoyable to work with. And I suspect that I won't be upgrading again until 2012-2014.

(SSDs getting cheaper is going to be the next major shift. Once you get used to a system with an SSD, it's really hard to go back to using a 7200 or 5400 RPM hard drive as the primary disk.)

Re:Why the hype? (1)

nabsltd (1313397) | more than 2 years ago | (#36775840)

That said, it takes 15 minutes or so to rip a DVD into a 1GB MKV file on my Core i7 laptop. In other words, we are well beyond the point where most consumers will see CPU speed being a limiting factor in everything they want to do.

Try the same thing with a Blu-Ray where you do any amount of image processing during the transcode. I had one movie where each pass took 10 hours. Admittedly, that's pathological, but it usually takes me about 1.5x real time per pass (so about 6 total hours for a 2 hour movie).

I'm not counting the time it takes to actually rip the Blu-Ray to the hard drive for the transcoding work...that's only about 30 minutes on most movies.

Re:Why the hype? (1)

Rockoon (1252108) | more than 2 years ago | (#36773408)

umm, mixing audio? Are you even aware of what you are talking about?

48kHz 5.1 surround sound (6 channels) requires only 288,000 samples per second. Even a fucking 386 could process this; in fact, back in the day, 33MHz 386s were playing 16-channel modules (that's software resampling and so forth of 16 independent channels) with enough free time left over to also do software 3D rendering.

Now, 288,000 samples per second on any machine between 2GHz and 4GHz yields between roughly 6,900 and 13,900 CPU cycles per sample PER CORE. In other words, you could probably process 5.1 audio with poorly written, single-threaded VBScript.

Admit it. You knew that you didn't know what the fuck you were talking about, so why the fuck did you bother?
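The arithmetic in the post above can be verified with a quick sketch (the cycle counts are per core, assuming one cycle maps to at most one instruction, which understates a modern superscalar CPU):

```python
# 48 kHz, 6-channel (5.1) audio throughput and the per-sample cycle budget.
sample_rate = 48_000
channels = 6
samples_per_second = sample_rate * channels  # 288,000 samples/s total

# Cycles available per sample per core on 2 GHz and 4 GHz machines.
low = int(2e9) // samples_per_second   # ~6,944 cycles/sample
high = int(4e9) // samples_per_second  # ~13,888 cycles/sample

print(samples_per_second, low, high)  # 288000 6944 13888
```

Thousands of cycles per sample is an enormous budget for simple mixing (a multiply-accumulate per channel), which is the commenter's point; effects chains are where the budget actually gets spent.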

Re:Why the hype? (1)

Joce640k (829181) | more than 2 years ago | (#36773508)

Nope, YOU'RE the one who needs a clue.

Go ask a few musicians if they have enough CPU power for their sequencers (eg. FL studio, Reason, etc).

Re:Why the hype? (1)

Jimbookis (517778) | more than 2 years ago | (#36773672)

I know someone who wrote a popular VST plugin and being a numbers n00b kept hitting the floating point denormal slowdown. I suggested he ditch floats and doubles and just use fixed point with careful scaling in each stage. He hadn't a clue what to do there and suffered the gripes from users about performance. That said, his plugin made wicked sounds. I'd say the musos plugins are somewhat poorly optimised.

Re:Why the hype? (1)

Rockoon (1252108) | more than 2 years ago | (#36773810)

Judging by the responses, I think that you are right. The majority of VST plugins are apparently programmed by code donkeys.

Re:Why the hype? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#36773542)

He might be one of those mysterious folks who means serious digital audio workstation stuff when he says "mixing audio"... That is, shall we say, slightly more intensive than just shoving around the pre-chewed stuff fast enough for glitchless output from the DAC.

Re:Why the hype? (1)

Anrego (830717) | more than 2 years ago | (#36773600)

Just mixing, maybe not. Effects can definitely chew CPU up though.

I use my desktop (i7 and a tonne of ram) as a guitar amp .. just doing convolution based amp/cabinet modeling gets CPU usage up pretty high. Add in some reverb (also convolution based) and while the box isn't exactly struggling.. it definitely notices.

And that's just one instrument with a limited set of effects.

Re:Why the hype? (1)

Jimbookis (517778) | more than 2 years ago | (#36773702)

WTF? Is your effects software written in JavaScript? Convolutions on an i7 should barely wake the CPU up, let alone struggle. I think there is some sort of I/O problem instead. Does it use a frickin' spinning while() loop as a wait function for the next sample tick?

Re:Why the hype? (1)

Rockoon (1252108) | more than 2 years ago | (#36773758)

Indeed. Apparently these people are using horribly written VST effects or something.

Even large convolution kernels at 96,000 samples per second (DVD-quality stereo) should indeed barely wake up an i7 CPU. A machine that WILL execute between 2 and 6 billion instructions per second per thread should not *ever* struggle with this workload. Hell, you could do hundreds of large FFTs per second.
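The "hundreds of FFTs per second" claim can be sanity-checked with the textbook ~5·N·log2(N) operation-count model. This is a cost estimate, not a benchmark; the 64K "large" transform size and the 2 GFLOP/s sustained rate are conservative assumptions.

```python
import math

# Rough cost model for one complex FFT: ~5 * N * log2(N) floating-point ops
# (the standard textbook estimate, not a measured figure).
N = 1 << 16                       # a "large" 64K-point FFT
flops_per_fft = 5 * N * math.log2(N)

sustained_flops = 2e9             # assumed: 2 GFLOP/s per core, conservative
ffts_per_second = sustained_flops / flops_per_fft

print(round(ffts_per_second))     # 381
```

Even with these deliberately low numbers, one core manages a few hundred 64K-point FFTs per second, so FFT-based convolution reverb at audio rates is nowhere near saturating an i7.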

Re:Why the hype? (1)

obarthelemy (160321) | more than 2 years ago | (#36773058)

If you're gaming, you're not using an IGP/APU, and won't be anytime soon.

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36773676)

Actually, I hope to be using an IGP/APU to do OpenCL physics and AI calculations without burdening my discrete GPU in the near future.

And actually, AMD's APUs do pretty well at casual 3D gaming, and even perform satisfactorily in older AAA 3D games.

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36772618)

fuck multi core, I want that 4.2ghz for legacy programs.

Re:Why the hype? (2)

Sulphur (1548251) | more than 2 years ago | (#36773076)

fuck multi core, I want that 4.2ghz for legacy programs.

"Bouncing Babies" at 4.2ghz awesome.

Re:Why the hype? (1)

alci63 (1856480) | more than 2 years ago | (#36772620)

Well... some of us also use computers for, say, databases. With this kind of workload, having several cores is a must. PostgreSQL, for example, uses a one-process-per-active-request model, and concurrency improves with more cores. Just to give one example. So it may not be significant for gaming, but it might be for general computing...

Re:Why the hype? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#36773610)

Not to mention virtualization:

With that being so common, even the crankiest "this workload is single-threaded, and it really wants a server to itself" applications are likely to find themselves sharing a multicore processor with a bunch of other such workloads.

Given that AMD's server offerings have lately been pretty cheap compared to Intel's, have the advantage of hypertransport being a good interconnect, along with an on-die memory controller(less of an absolute advantage now that Intel has QPI and an on-die memory controller, rather than having to pretend that FSB was good enough; but still quite good for performance), they should do pretty well in VM boxes.

Re:Why the hype? (1)

Damnshock (1293558) | more than 2 years ago | (#36772672)

And maybe single-core performance is already way more than enough for most users?

I'm running a laptop with an Intel Core 2 Duo T7200 and the performance for my day-to-day use is absolutely satisfactory. I'd rather add more cores than raw power per core and have a better multitasking/multi-threaded approach.

Regards

Re:Why the hype? (2)

adolf (21054) | more than 2 years ago | (#36772864)

Heh. I'm running a 1.83GHz Intel Pentium-M on my daily-use laptop. Its performance is absolutely satisfactory, as well, and it just turned 7 years old.

I had the option, recently, of buying a new battery for that machine or buying a new battery for a very similar, just-a-bit-newer Core Duo laptop that I also have (with a far-lesser display), or buying something completely different.

I elected to buy a battery for the old Pentium-M machine: It still does what I want, still feels quick compared to far-faster machines, and works just great for the stuff that actually earns me money.

But I don't mix multi-track audio on it, edit video, or do Serious Computations with it at all anymore (I did all of those when it was new). The hardest work it sees these days is probably when I watch Youtube videos and porn while torrenting the hell out of the hotel's bandwidth when I'm on the road, and it keeps up with that without a fight.

Re:Why the hype? (1)

adolf (21054) | more than 2 years ago | (#36772722)

It's no different than the hype behind the Athlon. Or behind Cyrix/IBM's low-cost 6x86MX. Or AMD's then-fast 40MHz 386 clones in both SX and DX variants. Or even the NEC V20.

Competition is good.

Re:Why the hype? (1)

aztracker1 (702135) | more than 2 years ago | (#36773112)

I don't know... the Athlon XP was very competitive with Intel, and the later Athlons smoked the Pentium 4s of the time... Core 2 vs. Phenom was a leg up for Intel, and the Core iX series extended that lead. On the IGP front, I think AMD will lead again, at least in overall performance. Hoping to see more options; the E-350 seems really compelling for its price class.

Re:Why the hype? (1)

Rockoon (1252108) | more than 2 years ago | (#36773662)

Those 8-core Bulldozer chips essentially have 4 FPUs and 32 ALUs.

So against a 4-core Sandy they won't be able to compete on FPU work, simply because Intel invested the space AMD is using for 4 more cores (aka 2 more modules) in kick-ass FPUs. Bulldozer is going to look like crap next to Sandy for FPU work. Period.

But against those same 4-core Sandys, the Bulldozers will likely completely dominate the integer scene. The Bulldozer will (reportedly) turbo to 4.2GHz when only using 4 cores. Even the best Sandy only turbos to 3.8GHz stock, and certainly doesn't do that with all 4 cores under load. That's 4-threaded integer work at 4.2GHz for BD versus only 3.4GHz for SB, or 8-threaded integer work at 3.2GHz for BD and 3.4GHz for SB -- but the Sandy shares ALUs when running more than 4 threads.

So all in all, Bulldozer is going to rock if you don't do heavy FPU stuff, or if you offload that stuff to the GPU (video encoding is typically offloaded), but Sandy will still rule if you are doing shit like raytracing, which isn't typically offloaded to GPUs.

Re:Why the hype? (1)

dwillmore (673044) | more than 2 years ago | (#36775670)

According to http://techreport.com/articles.x/19514 [techreport.com], peak FLOPs should be the same between a BD 'module' and an SNB 'core' if the BD is using FMA4/AVX and the SNB is using plain AVX.

To get maximum performance, you're going to have to code in assembly or use a library that's been coded that way. I expect programs like Prime95 will be first adopters of this.

Supposedly, Haswell (the full tick after IVB) will have FMA3/AVX, which should double the FLOP rate and surpass BD, but that's some way out, so we'll have to see what BD does in the meantime. By then, we could see a shrink of BD with more 'modules' or clock speed improvements. Best not to count those eggs until they're laid, if not hatched.
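The peak-FLOPs parity claim follows from the standard back-of-envelope formula (units x SIMD lanes x flops per lane per cycle x clock). The 3.2GHz figure below is purely illustrative:

```python
def peak_gflops(units, simd_lanes, flops_per_lane_per_cycle, clock_ghz):
    """Theoretical peak GFLOP/s. 256-bit AVX gives 4 double-precision lanes;
    a fused multiply-add counts as 2 flops per lane per cycle, and so does
    issuing one AVX add plus one AVX mul in the same cycle on separate ports."""
    return units * simd_lanes * flops_per_lane_per_cycle * clock_ghz

# One BD module (a single 256-bit FMA4-capable FPU shared by two cores)
# vs. one SNB core (separate 256-bit add and mul ports): identical peak per clock.
bd_module = peak_gflops(1, 4, 2, 3.2)
snb_core = peak_gflops(1, 4, 2, 3.2)
print(bd_module, snb_core)  # 25.6 25.6
```

That parity is exactly why the module-vs-core comparison in the linked article comes out even on paper; real sustained throughput then depends on whether the code actually emits FMA4 (BD) or balanced add/mul streams (SNB).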

Re:Why the hype? (1)

Adayse (1983650) | more than 2 years ago | (#36773514)

There isn't much hype; that's the difference. Attention has shifted to phones. Desktop PCs and the parts that go in them are worthless because they confer almost no social status on their owners, unless the owner is a teenage boy.

Before, the battle was about performance; now it's about heat, price, and performance, because every household has a couple of computers per person and the cost of a CPU is much lower.

Re:Why the hype? (1)

Lonewolf666 (259450) | more than 2 years ago | (#36773376)

Which $700+ CPUs?
Newegg.com offers the AMD Phenom II X6 1100T Black Edition for $189.99 right now, and that model is currently AMD's top desktop CPU.

It eats a bit too much energy for my taste (125W TDP), but in price/performance it is a pretty nice CPU. If Bulldozer can improve on the power consumption, my next CPU upgrade will definitely be a Bulldozer.

Re:Why the hype? (0)

0123456 (636235) | more than 2 years ago | (#36775034)

Newegg.com offers the AMD Phenom II X6 1100T Black Edition for $189.99 right now, and that model is AMD's top desktop CPU right now.

It eats a bit too much energy for my taste (125W TDP), but in price/performance it is a pretty nice CPU.

Except an i5 will stomp on it, cost less and use only about 50W of power while doing so. I believe even a dual-core i3 typically beats the non-black edition hexacore Phenoms for not much more than $100.

Re:Why the hype? (1)

Targon (17348) | more than 2 years ago | (#36775590)

Quad-core makes a huge difference when you are busy and doing many things at the same time. Yes, the i5 has a better core design, so it is faster in most tasks, and that is why AMD has been hurting, except at the lower end of the market. Intel graphics are horrible, so for the average consumer who will never add a video card to their machine, a $500 AMD-based machine will tend to be a bit better than a $500 Intel machine for "total experience". As you go up from there, AMD starts looking worse, since you get systems with an Intel processor and an AMD or NVIDIA video card that clearly give Intel the edge.

That is why Bulldozer is so important, because if AMD can get 25% better performance per clock with the same number of cores compared to earlier chips and can do it for less money, that will really help extend the price range for where AMD is competitive. There have been other reports that Bulldozer is able to beat the i7-2500 in overall performance. Even if extra cores are required, if the overall benchmarks give AMD the lead at the same price point, that really will help make for a competitive environment.

When it comes to processor power draw, you also have to compare how the companies measure these things. AMD rates chips on max power draw, not sustained power draw. So a 125W AMD chip may only be drawing 50W, but it can go up to 125W. In this case, we are looking at an eight-core processor, so if you think it will draw the same amount of power as a four-core processor, that's not realistic. We shall see what the real numbers look like when the chips are officially launched.

Re:Why the hype? (1)

Sloppy (14984) | more than 2 years ago | (#36774142)

Do people really believe that it'll be on-par with Sandy Bridge?

We don't know! It might. It might not. Everyone hopes it will.

Whether it wins or loses against Sandy Bridge, one thing's for sure: it's interesting. It sounds like you're still running MS-DOS, so you don't ever need any parallelism, but for the rest of us, AMD has shown the future of hyperthreading and multi-core. It looks like the question of "should we split this piece up so that it can do n things at once?" may be on the table for every part of the CPU.

Re:Why the hype? (1)

Targon (17348) | more than 2 years ago | (#36775634)

You mean multi-threading. HyperThreading is an Intel term for running two threads at a time on one core and "tricking" the OS into thinking there are twice as many cores as there really are. It does help performance in heavily threaded applications, but if you compare eight real cores to four cores that look like eight, and you improve performance so those eight real cores are competitive per-clock with Intel cores, there's a big advantage.

$320 for an 8-core processor that I think starts at 3.8GHz (not the 3.2GHz mentioned in this article), and that's sounding pretty good.

Re:Why the hype? (1)

GreatBunzinni (642500) | more than 2 years ago | (#36774334)

The issue of performance is becoming, with each passing day, increasingly irrelevant outside a few very small niches. For the past half-dozen years, any low/mid-end desktop processor has delivered more than enough power to cover any computing need a regular person may have. Browsing video clips online, browsing social network sites, handling office applications and communications are all well taken care of by any processor. There is a reason the people paid to push hardware and hardware reviews were forced to develop specialized programs performing artificial, outlandish tests to serve as benchmarks on today's hardware: the software people actually use doesn't come near taxing it.

If there is any doubt, just take a look at the market for desktop workstations: there isn't one. People who need computing power simply pick whatever hardware is already on the shelf of any generic electronics store, install their software, and run it. Years ago, people paid thousands of euros for a workstation with multiple processors so that they could run CAD software, or specialized number-crunching such as structural analysis programs, and even then the hardware was maxed out. Nowadays, you put the very same CAD software on a sub-500-euro laptop and everything runs smoothly. And finite element method software? On today's low/mid-end hardware it's possible to solve large systems of linear equations with over 30 thousand degrees of freedom in under a minute, in software that is single-threaded.
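The "solving large systems of linear equations" workload mentioned above is easy to illustrate. Below is a toy dense direct solver (Gaussian elimination with partial pivoting), the O(n^3) kernel that FEM packages spend their time in; it's pure Python for clarity, whereas real codes use tuned BLAS/LAPACK routines and sparse storage:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    A is an n x n list of row lists, b a length-n list. O(n^3) work,
    which is why this kernel dominated workstation benchmarks for years."""
    n = len(A)
    # Build the augmented matrix [A | b], copying so inputs aren't clobbered.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Eliminate this column from all rows below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5, x + 3y = 10  =>  x = 1, y = 3
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```

Scaled up to the ~30,000-unknown systems mentioned above, the cubic cost is exactly what made this a multi-processor workstation job years ago and a laptop job today.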

So, again, performance has become largely irrelevant. Without marketing and fanboyism, if someone is faced with the choice of spending either 1000 euros or 200 euros on a processor and the only difference that they will notice is, say, that some task, when run on the 1000 euro processor, takes 2m0s to complete instead of 2m15s, no one in their right mind would spend an extra 800 euros just to get that sort of benefit.

Re:Why the hype? (0)

Anonymous Coward | more than 2 years ago | (#36775084)

It's not Sandy Bridge that's AMD's problem; it's Intel's upcoming Ivy Bridge and Sandy Bridge-E releases they need to worry about. Ivy Bridge is the die shrink of Sandy Bridge, reportedly with 30% more performance, and will be the low-cost alternative they need to beat (it uses the same LGA 1155 socket as current Sandy Bridge, with certain chipsets), whereas the E release is Intel's 8-core Sandy Bridge processor.

What's your objective here? (1)

tyrione (134248) | more than 2 years ago | (#36772396)

Are you showing that it's a fraud, like the article cited or just to get clicks like your own Headline shows? A bit of both, eh?

How (0)

Anonymous Coward | more than 2 years ago | (#36772398)

How did a story about an article about faked leaked info make it to the front page?

Submitter karma ? (0)

billcopc (196330) | more than 2 years ago | (#36772440)

I think stories, i.e. submitters should have karma. I want to downvote this tripe so bad...

Octal? (0)

Shag (3737) | more than 2 years ago | (#36772550)

But don't most modern CPUs use hexadecimal?

Re:Octal? (2, Funny)

Anonymous Coward | more than 2 years ago | (#36772656)

Yeah, but that is base 16. So it must therefore allocate two numbers per core. 1st core handles all the 1s and 2s, second core all the 3s and 4s etc and the eight core all the Es and Fs. It makes perfect sense, really.

Re:Octal? (1)

Z00L00K (682162) | more than 2 years ago | (#36772706)

CPUs are binary; we then use octal or hex to represent the contents of the binary structure because it's more convenient.

Re:Octal? (1)

OrangeTide (124937) | more than 2 years ago | (#36772796)

octal never seemed that convenient to me.

Re:Octal? (0)

Anonymous Coward | more than 2 years ago | (#36773958)

DEC used octal because most of their early minicomputers had word lengths that divided evenly by 3: machines such as the 18-bit PDP-15, the 12-bit PDP-8, and the 36-bit PDP-10. When they made the 16-bit PDP-11 they STILL used octal, but now the most significant digit was ONLY a '1' or a '0'. The instruction word format broke down into 3-bit fields, though, so they used octal. The 32-bit VAX was the only machine for which DEC used hexadecimal notation; its instruction word broke down into 4-bit fields.

Some early Intel documentation used octal for the 8008 and 8080 processors; again, the instruction words break down into 3-bit fields. Eventually, though, the documentation changed over to hex.
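The 3-bit-field point is easy to see in code. The example below uses the classic PDP-11 encoding of MOV R1,R2 as octal 010102, where the octal digits line up with the opcode (01) and the two mode/register operand fields (01 = mode 0, register 1; 02 = mode 0, register 2):

```python
def to_octal_fields(word, bits=16):
    """Split a word into 3-bit groups, most significant first.
    Each group is one octal digit; for a 16-bit word the top
    group holds only 1 bit, so the leading digit is always 0 or 1."""
    ndigits = (bits + 2) // 3  # 6 octal digits cover 16 bits
    return [(word >> shift) & 0o7 for shift in range((ndigits - 1) * 3, -1, -3)]

# PDP-11 'MOV R1, R2': the octal digits fall exactly on the field boundaries.
word = 0o010102
print(oct(word), to_octal_fields(word))  # 0o10102 [0, 1, 0, 1, 0, 2]
```

The same word rendered in hex (0x1042) splits the 3-bit register fields across digit boundaries, which is why octal was the natural notation for these machines.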

Names (0)

rossdee (243626) | more than 2 years ago | (#36772804)

Bulldozer is not exactly a synonym for high speed, low energy consumption and compact size.

Re:Names (0)

Anonymous Coward | more than 2 years ago | (#36772906)

Yeah, they should have went with Killdozer!

MAX 4.2ghz (0)

Anonymous Coward | more than 2 years ago | (#36772974)

That sucks. My 6-year-old box is 3GHz, so that's not much better. For the price, I bet you could get 4 of those 3GHz boxes vs. 3 of these.

Re:MAX 4.2ghz (1)

realityimpaired (1668397) | more than 2 years ago | (#36773340)

And what are you doing that actually requires that 3GHz? I am currently typing this on an Arrandale-based laptop with a core speed of 1.2GHz and it is plenty fast enough and responsive enough for everything I want to throw at it. If you'd rather wiki the exact specs of my processor, go right ahead. It's a Celeron U3600. I'm not doing any high end gaming on this system (and believe it or not, most computer owners aren't gamers), so it really doesn't need much more oomph than it currently has.

And for the gaming market... how many threads are you running on that 6 year old 3GHz processor? 2 at most? And that's assuming it's a Pentium D with Hyperthreading? I have a year-and-a-half old Core i7 laptop that runs 8 threads at the same time, with a core speed of 2.93GHz. Newer processors can run even more threads. For *most* computer use, it's not the speed that matters, it's the number of threads you can run at that speed.
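The "threads you can run at once" count is exactly what the OS reports as logical CPUs, and SMT inflates it. A quick way to see it (`os.cpu_count()` is standard Python; the `/proc/cpuinfo` parsing is a Linux-only sketch and returns None elsewhere):

```python
import os

# os.cpu_count() reports *logical* CPUs: on a HyperThreaded/SMT chip this is
# typically twice the physical core count, which is the "8 threads on a
# quad-core i7" effect described in the comment above.
print("logical CPUs visible to the OS:", os.cpu_count())

def physical_cores_linux(path="/proc/cpuinfo"):
    """Count unique (physical id, core id) pairs from /proc/cpuinfo.
    Linux-specific; returns None if the file or fields are unavailable."""
    try:
        with open(path) as f:
            text = f.read()
    except OSError:
        return None
    seen = set()
    for block in text.split("\n\n"):
        fields = {}
        for line in block.splitlines():
            key, sep, value = line.partition(":")
            if sep:
                fields[key.strip()] = value.strip()
        if "physical id" in fields and "core id" in fields:
            seen.add((fields["physical id"], fields["core id"]))
    return len(seen) or None

print("physical cores (Linux only):", physical_cores_linux())
```

Comparing the two numbers on an HT-capable machine shows the 2:1 logical-to-physical ratio directly; on a chip without SMT they match.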

Re:MAX 4.2ghz (1)

Targon (17348) | more than 2 years ago | (#36775424)

On a high-speed connection, web browsing renders pages faster on a faster processor, you have e-mail, plus a word processor, Quickbooks, all running at the same time comfortably. Just because you do very little with your machine doesn't mean that others use their computers like an overgrown tablet and only do one thing at the same time.

Re:MAX 4.2ghz (0)

Anonymous Coward | more than 2 years ago | (#36773360)

You think the proc in your old box is in any way comparable to a brand new octal core processor?

Can't we have like a geek test or something to post here? Shit... I mean, the retards keep creeping in and lowering the overall intelligence of the group.

Re:MAX 4.2ghz (1)

Targon (17348) | more than 2 years ago | (#36775378)

If you read the other responses, you would see that there is a lot of really questionable stuff here that makes the leaked information worthless. There was a known issue in the pre-release Bulldozer cores that crippled performance, which is a big part of why the release was delayed. Now, you clearly have no knowledge of CPU design if you think that clock speed alone is an indicator of performance. Intel has been beating AMD at the same clock speed for a while now due to differences in design, not clock speed. A 2.2GHz Athlon 64 was around as fast as a 3.8GHz Pentium 4 in real-world situations, and many clueless people just couldn't wrap their heads around that idea.

The big question will be what sort of performance improvements the Bulldozer core design has brought to the table compared to previous generations. Since 4-core versions will be available, that would help eliminate core count differences and would set things up for a straight drop-in CPU replacement for benchmarking. Still, if you don't think that a 3GHz processor today is faster than a 3GHz processor from six years ago, you really need to try doing a real-world comparison.

possessive apostrophe (1)

epine (68316) | more than 2 years ago | (#36773292)

This is why company's work hard to control how and when information is shared with the public.

Sometimes you just can't help yourself.

From What's a Metaphor For? [chronicle.com] :

New research in the social and cognitive sciences makes it increasingly plain that metaphorical thinking influences our attitudes, beliefs, and actions in surprising, hidden, and often oddball ways.

Good fit for the mac mini over the i3 or i5 (1)

Joe_Dragon (2206452) | more than 2 years ago | (#36774332)

The mini needs better than i3/i5 on-board video, and for Apple, going from NVIDIA on-board video to Intel graphics is a sidegrade at best.
