
DDR3 Isn't Worth The Money - Yet

Zonk posted more than 7 years ago | from the zoom-but-not-that-much-zoom dept.


An anonymous reader writes "With Intel's motherboard chipsets supporting both DDR2 and DDR3 memory, the question now is whether DDR3 is worth all that extra cash. Trustedreviews has a lengthy article on the topic, and it looks like (for the moment) the answer is no: 'Not to be too gloomy about this, but the bottom line is that it can only be advised to steer clear of DDR3 at present, as in terms of performance, which is what it's all about, it's a waste of money. Even fast DDR2 is, as we have demonstrated clearly, only worthwhile if you are actually overclocking, as it enables you to raise the front-side bus, without your memory causing a bottleneck. DDR3 will of course come into its own as speeds increase still further, enabling even higher front-side bus speeds to be achieved. For now though, DDR2 does its job, just fine.'"


I need to get out more (5, Funny)

Gilatrout (694977) | more than 7 years ago | (#20602487)

I read Intel supports Dance Dance Revolution 3.

Re:I need to get out more (1)

nick-whoisnick (1156771) | more than 7 years ago | (#20602519)

I read the same thing... only I overclock my DDR - EXTREME!

Re:I need to get out more (1)

mogwai7 (704419) | more than 7 years ago | (#20603297)

No. You need to get out less.
(Unless you have a DDR machine at home.)

Re:I need to get out more (1)

SQLGuru (980662) | more than 7 years ago | (#20604627)

Dude, I've had DDR for my Xbox for years (daughters).....and it was on the PS2, as well. Who needs to go out to look spastic?

Layne

Re:I need to get out more (1)

Ukab the Great (87152) | more than 7 years ago | (#20603941)

So buy the latest Intel processor, jump up and down on it, and report back with your results.

I agree (4, Funny)

TheRealFixer (552803) | more than 7 years ago | (#20602501)

I would have to agree. I see all these kids pumping quarters into these machines and pretending to dance. Seems like a complete waste of money to me.

Re:I agree (1, Offtopic)

Applekid (993327) | more than 7 years ago | (#20602709)

While the DDR joke is fighting hard for its place amongst Soviet Russia and Welcoming Overlords...

You'd be hard pressed to find one player of that genre game to confess they actually consider it dancing any more than whack-a-mole is a simulator of pest-control.

Duh? (5, Insightful)

ynososiduts (1064782) | more than 7 years ago | (#20602515)

Who in their right mind would pay so much for RAM? The only people I can think of are middle- to upper-class teenagers with lots of money - the ones who run 8800 Ultras in SLI thinking that two cards = twice the performance, when it's more like a 30-50% increase. Most educated system builders won't spend more money than they have to, and DDR3 is just overpriced.

Re:Duh? (1)

mr_sifter (1000807) | more than 7 years ago | (#20602561)

Exactly - DDR3 is something like 3 or 4x the price of DDR2. Who really expects it to offer the performance to match that kind of price step-up?

Re:Duh? (1)

Gr8Apes (679165) | more than 7 years ago | (#20602967)

Heck, DDR2 is only now worth it, since it's cheaper than DDR. I have a DDR 400 system at home that's more than 20% OC'd on the memory bus and rock stable. When I bought the 2GB that's in that machine, DDR was about half the price of DDR2.

I'll jump on the DDR2 bandwagon with my next system, unless DDR3 drops to the same or less than DDR2 prices.

Re:Duh? (1)

Poromenos1 (830658) | more than 7 years ago | (#20602571)

What's more, you're spending all that money on a system that will be obsolete in a few months (obsolete for your purposes, anyway), and on top of that, there aren't any games that require that much speed! By the time new games that do require it come out, you could be buying the next latest and greatest. Who needs 140 FPS anyway? Are there even screens that can display it?

Re:Duh? (1)

ynososiduts (1064782) | more than 7 years ago | (#20602631)

If there are, I doubt they're available to the general public. Few of the aforementioned young people realize that with a 60-70 Hz LCD monitor, it doesn't matter what FPS you get above 70. Some claim to be able to tell the difference between 60 and 90 FPS. As if they could even see it.

Re:Duh? (1)

cnettel (836611) | more than 7 years ago | (#20602733)

The interesting figure is the absolute minimum FPS, measured as 1/(max time between two frames). Even with an average of 70, you can sometimes exceed 16 ms between frames. And we're quite sensitive to these things; as an example, I notice a somewhat uncanny effect with two TFTs with different panels and video scaled across both. They have different response times and possibly different processing, but they should still be at most a (60 Hz) frame or so apart. The effect is nonetheless very visible.
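The min-FPS metric the post describes is easy to compute from per-frame times; here's a minimal sketch (the sample frame times are made up for illustration):

```python
def min_fps(frame_times_ms):
    """Worst-case FPS = 1 / (longest gap between two frames)."""
    return 1000.0 / max(frame_times_ms)

def avg_fps(frame_times_ms):
    """Average FPS over the whole run."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# An average near 70 FPS can still hide a 25 ms stall:
times = [14, 14, 14, 25, 14, 14, 14, 14, 14, 14]
print(round(avg_fps(times)))  # 66 average
print(round(min_fps(times)))  # 40 worst-case
```

This is why a single averaged FPS number can look fine while the game still feels jerky.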

Re:Duh? (1)

plover (150551) | more than 7 years ago | (#20602869)

There's a frequency that seems to be somewhere between 50 and 72 Hz at which the perception of flicker ends for most people. I know that on a CRT a 60 Hz refresh rate is quite bothersome to me but at 72 Hz it's not, while on an LCD screen a 60 Hz refresh rate doesn't bother me at all. This makes me believe my perceptions are related to the overall luminance of the screen (which is evened out on an LCD by the backlighting), rather than the display rate of the bits themselves.

Of course, I'm not playing as many first-person shooters these days, so I don't have the same incentive to install monster-sized video cards as I used to. But when I did, keeping the frame rate above the refresh rate was important to avoid "tearing".

Re:Duh? (1)

Silverlancer (786390) | more than 7 years ago | (#20602881)

That's because LCDs don't flicker, while CRTs do... an LCD wouldn't show flicker at 1 Hz; it would just refresh slowly.

Re:Duh? (1)

Poromenos1 (830658) | more than 7 years ago | (#20602901)

That's because LCDs don't refresh. They don't have a beam that scans the screen 60 times a second, like CRTs do. Instead, their pixels remain at that value until they are given the signal to change, and the faster that change happens, the faster the screen is. That's why an LCD could be at 1 Hz and you wouldn't notice anything (until the picture changed, anyway).

Re:Duh? (1)

tlhIngan (30335) | more than 7 years ago | (#20604185)

That's because LCDs don't refresh. They don't have a beam that scans the screen 60 times a second, like CRTs do. Instead, their pixels remain at that value until they are given the signal to change, and the faster that change happens, the faster the screen is. That's why an LCD could be at 1 Hz and you wouldn't notice anything (until the picture changed, anyway).


False, actually.

LCDs are refreshed much the same way as CRTs are. You start at the upper left, write the pixel's data, then the next pixel, until you've finished the line. Then you go down to the next line and repeat the process.

What keeps LCDs from flickering is that the persistence of each pixel is a lot longer than the persistence of a CRT's phosphor. An active matrix LCD (the kind with a transistor at every pixel) is effectively write-only DRAM, with the liquid crystal and electrode plates forming the capacitor and the transistor doing the addressing. Like DRAM, it needs refreshing, or the image fades. Pull the LCD data cable out and you'll see it fade - it may take a few seconds, though. And like DRAM, it's easy to write a new value to each pixel: if it's displaying white, I can make it display black trivially by rewriting its color value. A CRT, by contrast, can't actively switch a lit pixel off - it can only stop refreshing it and let the phosphor fade. So phosphors have to decay very quickly, or you get the same "refresh time" issue as LCDs.

Early CRTs ghosted much like early LCD panels... nowadays the phosphors are much less persistent, so they don't blur the image. As a side effect, the screen flickers much more. Passive-matrix LCD screens are much like CRTs...

Re:Duh? (1)

InvalidError (771317) | more than 7 years ago | (#20604889)

Pull the LCD data cable out and you'll see it fade

Not quite: most monitors will display a [pick your color, usually a shade of black] screen for a few seconds upon detecting loss of sync before powering down to standby, thereby clearing whatever happened to be displayed within 20ms of the VGA/DVI/HDMI/whatever cable plug being pulled.

If you really want to see Active-Matrix TFT persistence, you would fare better with yanking the power plug out of the wall socket and shining a very bright light on the LCD. Under these circumstances, the last painted screen sometimes remains distinguishable for 2-3 seconds after the power went out... with backlight, this could be a second or two longer.

Re:Duh? (1)

Mprx (82435) | more than 7 years ago | (#20603959)

My monitor can display 200FPS. (Iiyama Vision Master Pro 454). Some CRTs are even faster.

Re:Duh? (1)

The -e**(i*pi) (1150927) | more than 7 years ago | (#20604369)

Um, I run my monitors at SXGA and 100 FPS with VSync on and it looks nice. I think the VGA standard will allow something like 160-200 FPS at around 640x480, but until DVI CRTs come out we won't be able to get more than 100 FPS on a CRT at SXGA. Also, in games, 200+ FPS helps tremendously with shot registration, but lag is usually the biggest factor in registration once you're over 100 FPS.

Re:Duh? (1)

EagleEye101 (834633) | more than 7 years ago | (#20603065)

A little off topic, but I read on wikipedia [wikipedia.org] that with the R600 series/drivers the crossfire solution is approaching its theoretical maximum of twice the performance of a single card.

Re:Duh? (1)

JCSoRocks (1142053) | more than 7 years ago | (#20603087)

Actually, with the 8800's the benchmarks I've seen reflect more like a 60% - 80% increase. The new DX10 hardware design makes SLI much more powerful. I've got two 8800 GTSs in SLI and I've seen improvements similar to the 60%-80% I talked about. With older cards, that's obviously not the case, but the newer stuff is pretty impressive.

Re:Duh? (1)

ynososiduts (1064782) | more than 7 years ago | (#20603139)

That's really only useful if you're using super-high resolutions or multiple monitors. My 8800 GTS (320 MB version) can play any game at my monitor's native 1680x1050 with everything maxed out at 40 FPS, which is the minimum I would accept.

Re:Duh? (1)

mogwai7 (704419) | more than 7 years ago | (#20603317)

SLI can only be enabled while using a single monitor.

Re:Duh? (1)

Orange Crush (934731) | more than 7 years ago | (#20604375)

SLI can only be enabled while using a single monitor.

Well, with two monitors I can't think of a reason to do SLI since you can have each card powering one whole monitor.

Or technically illiterate people (4, Insightful)

Moraelin (679338) | more than 7 years ago | (#20603461)

Some 6-7 years ago, I happened to be at the local computer store to buy some stuff. (In the meantime I buy most components online, so that's not to say it hasn't happened since; I just wasn't there to see it.)

So an older guy came in and asked them to build him a system. He was pretty explicit that he really didn't want much more than to read email and send digital photos to his kids. You'd think entry-level system, right? Well, the guy behind the counter talked him into buying a system that was vastly more powerful than my gaming rig. (And bear in mind that at the time I was upgrading so often to stay high end that the guys at the computer hardware store were greeting me happily on the street. Sad, but true.) They sold him the absolute top-end Intel CPU, IIRC some two gigabytes of RAM (which at the time was enterprise-server class), the absolute top-end NVidia card (apparently you really, really need that for graphical stuff like, say, digital photos), etc.

So basically, don't underestimate what lack of knowledge can do. There are a bunch of people who will be easy prey for the nice man at the store telling them that DDR3 is 50% better than DDR2, 'cause, see, 3 is a whole 50% bigger than 2.

And then there'll be a lot who'll make that inference on their own, or based on some ads. DDR3 is obviously newer than DDR2, so, hmm, it must be better, right?

Basically, at least those teenagers you mention read benchmarks religiously, with the desperation of someone whose penis size depends (physically) on his 3DMark score and how many MHz he's overclocked. If, god forbid, his score falls 100 points short of the pack leader, he might as well have "IMPOTENT, PLEASE KILL ME" tattooed on his forehead. At 1000 points less, someone will come to his door with rusty garden scissors and revoke his right to pee standing. So they'll know at least roughly what difference it makes, or at least whether the guys with the biggest e-penis are on DDR2 or DDR3.

I worry more about moms and pops who don't know their arse from their elbow when it comes to computers. Now, _normally_ those won't go for the highest-end machine, but I can see them swindled out of an extra 100 bucks just because something's newer and might hopefully make their new computer less quick to go obsolete.

Re:Duh? (1)

jbeaupre (752124) | more than 7 years ago | (#20603553)

I agree, but becoming an educated system builder is no trivial task. Back when Computer Shopper could kill small pets if dropped and I read it monthly, I could build a system no problem.

Fast forward a decade, my career is different and I'm not as well informed. But I need to build a specialized system. For FEA on large problems (100,000+ nodes) you need masses of fast RAM. Fast everything. But I own a big chunk of my business and have to pinch pennies or it comes out of my pocket. Even if I go to someone to build it for me, I better know what I'm asking for.

So now I've got to figure out how to balance cost against speed. But then throw in the sometimes-large-chunk, sometimes-small-chunk memory access for long periods with gigabytes of RAM, and do I go for latency, bandwidth, etc.? So I go a-researchin'. Lots of information, but no clear conclusion. In the end, I may end up paying through the nose for RAM "just in case." My business depends on it.

So far it looks like DDR3 is ill-suited to FEA by its very nature; it'd stall the processor too much. So if I overpay, it will be for tried-and-true low-latency, fast RAM - the kind marketed for overclocking - bought not to overclock but to provide stability and avoid buying registered RAM.

Re:Duh? (1)

pilgrim23 (716938) | more than 7 years ago | (#20604287)

Does anyone recall when FPM DRAM cost a second mortgage for 64K? Memory has ALWAYS been expensive. Most of that cost is hype and nonsense, but it still comes out of my pocket...

Re:Duh? (1)

vil3nr0b (930195) | more than 7 years ago | (#20605459)

"The only people I can think of are the middle - upper class teenagers with lots of money" - I beg to differ... government entities buying new clusters, along with the same hardware enthusiasts who bought the Asus A7A266, which had the option of SDRAM or DDR266. Sure, there are early chipset issues, but they're easily remedied down the road.

Ad-free! (3, Insightful)

InvisblePinkUnicorn (1126837) | more than 7 years ago | (#20602523)

I'm so used to crap like c|net that I immediately went searching for a "printer-friendly" (aka, ad-free) version of the article, but lo and behold, that's not necessary. To think, I could actually read an article online without having to navigate through the usual nightmare... what an intriguing concept!

Re:Ad-free! (0)

Anonymous Coward | more than 7 years ago | (#20604241)

Firefox extension idea: automatically redirect to the printer version of such pages. Should exist, if it doesn't already...

AMD (1)

eknagy (1056622) | more than 7 years ago | (#20602569)

Yawn.
Well then, we have to wait for AM2+ to become available; with the new AM2+ Barcelonas, it will be worth the money.
Reminds me of RDRAM...

Slashdot (-1, Troll)

Anonymous Coward | more than 7 years ago | (#20602579)

I keep forgetting this is Slashdot. Let me make sure everything conforms:
Love Linux, hate anything Microsoft (n/a for this story)
Hate all Republicans, love all Democrats (n/a for this story)
If it is free, love it; if not, hate it unless it is cool (n/a for this story)
If it is made by Mac and not being compared to Linux, love it (n/a for this story)
If it comes with the source, love it; if not, hate it (n/a for this story)
If it is about Intel, hate it; if it is AMD, love it (Check)
Ok, this is a good Slashdot story. Flame away :)

Re:Slashdot (1)

ynososiduts (1064782) | more than 7 years ago | (#20602729)

You obviously haven't been around lately. There's a lot of love for Intel's Core stuff, and Intel has open-source drivers for a lot of their hardware too. So there really is no Intel hate.

Re:Slashdot (1)

somersault (912633) | more than 7 years ago | (#20603551)

Yep, there's just poor product hate.

Re:Slashdot (1)

Joe The Dragon (967727) | more than 7 years ago | (#20603815)

So do ATI/AMD, and they have real video cards with their own RAM.

Could someone enlighten me the power consumption.. (0)

Anonymous Coward | more than 7 years ago | (#20602585)

Could someone enlighten me about the power consumption of these RAM generations nowadays? Do we need something disruptive now?

Why do these reviews only focus on one thing? (5, Interesting)

downix (84795) | more than 7 years ago | (#20602587)

Every time I see "the need isn't there" or "there's more than enough memory bandwidth" and check the figures, they're only measuring the CPU's memory needs. Well, hate to break it to you, but there's more to a computer than the CPU. Having that extra bandwidth means those lovely PCI bus-mastering devices (such as my SCSI-3 controller and quad-FireWire card) aren't fighting with the CPU for memory access. Frankly, add in a game accelerator like the PhysX and a high-end GPU fetching data from main memory for its local cache, and even DDR3 starts looking a bit narrow...

Re:Why do these reviews only focus on one thing? (1)

JCSoRocks (1142053) | more than 7 years ago | (#20603121)

Did you just say PhysX? Don't you know those things are only good for working on your trash-can hook shot from your desk chair? At least... I haven't read any reviews that say otherwise.

Re:Why do these reviews only focus on one thing? (5, Insightful)

Slashcrap (869349) | more than 7 years ago | (#20603521)

Having that extra bandwidth means that those lovely PCI Bus Mastering devices (such as my SCSI 3 controller, and quad firewire card) aren't fighting with the CPU for memory access.

With a SCSI-3 card and a 4-port FireWire card you'd be looking at about 360MB/s of bandwidth, assuming they reach their max theoretical speed (and of course PC hardware always reaches its maximum theoretical speed). Unless they're both on the PCI bus, in which case it's 133MB/s max for both. Which is fairly minor compared to the 6GB/s of memory bandwidth I get with shitty DDR2 on a shitty motherboard.

Unless you can provide evidence to the contrary, I am going to go out on a limb and suggest that the performance increases you are expecting do not actually exist. Unless your primary workloads involve running memory benchmarks and Prime95 in which case I would point out that you accidentally posted to Slashdot instead of the Xtremesystems forums.
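The arithmetic in the parent can be checked in a few lines; the per-device numbers are assumptions (Ultra160 SCSI at 160 MB/s, FireWire 400 at ~50 MB/s per port), chosen to match the post's ~360 MB/s total:

```python
# Peripheral vs. memory bandwidth, theoretical maxima (figures are assumptions).
scsi_mbs = 160                  # Ultra160 SCSI
firewire_mbs = 4 * 50           # four FireWire 400 ports at ~50 MB/s each
peripherals_mbs = scsi_mbs + firewire_mbs
memory_mbs = 6.0 * 1000         # the "shitty DDR2" figure from the post

print(peripherals_mbs, "MB/s of peripherals")                    # 360
print(f"{peripherals_mbs / memory_mbs:.0%} of memory bandwidth")  # 6%
```

Even at full tilt, the peripherals consume a small fraction of commodity DDR2 bandwidth, which is the parent's point.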

Re:Why do these reviews only focus on one thing? (2, Informative)

julesh (229690) | more than 7 years ago | (#20603735)

Every time I see "the need isn't there" or "there's more than enough memory bandwidth" I check their figures, they're only measuring the CPU memory needs.

The reason they're only measuring the CPU memory needs is because the CPU memory needs dwarf all others.

Max CPU memory access rate (Intel Core 2 @ 1333MHz FSB) = 10.7 GB/s
Max PCIe memory access rate (16 lanes @ 2.5GT/s) = 4 GB/s

Total 14.7GB/s over 2 channels of memory = 7.35GB/s per channel, or roughly 920MT/s per 64-bit channel. So, if both your CPU and your I/O devices are running at 100% capacity on a current high-end system, you might benefit from DDR3 memory (2GB for £406 from my usual supplier). If, however, you can put up with not using 100% of your CPU capacity when you need to use your I/O capacity (I think most people can, you know), you can get 10.7GB/s with dual-channel DDR2/667 (2GB for £56 from my usual supplier).

I don't see why the faster memory is worth paying enough extra that I could buy an entire extra computer instead, when I will only use it in the rare case I'm maxing out both I/O bandwidth and CPU bandwidth.
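The figures above follow from bus widths that the post assumes implicitly (a 64-bit FSB, PCIe 1.x at ~250 MB/s usable per lane, dual-channel memory); a sketch of the arithmetic:

```python
# Reproducing the back-of-envelope numbers; bus widths are assumptions.
fsb_mts = 1333                                # Core 2 front-side bus, MT/s
cpu_gbs = fsb_mts * 8 / 1000                  # 8 bytes/transfer -> ~10.7 GB/s
pcie_gbs = 16 * 0.25                          # 16 lanes -> 4 GB/s
total_gbs = cpu_gbs + pcie_gbs                # ~14.7 GB/s worst case
per_channel_mts = total_gbs / 2 * 1000 / 8    # data rate needed per 64-bit channel
print(f"total {total_gbs:.1f} GB/s -> ~{per_channel_mts:.0f} MT/s per channel")
```

Only when CPU and I/O demand peak simultaneously does the required per-channel rate exceed what commodity DDR2 supplies.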

Re:Why do these reviews only focus on one thing? (1)

unfunk (804468) | more than 7 years ago | (#20604951)

I don't see why the faster memory is worth paying enough extra that I could buy an entire extra computer instead, when I will only use it in the rare case I'm maxing out both I/O bandwidth and CPU bandwidth.

Agreed.

When I buy memory, it's always the best value stuff that I get. Do I get 1GB DDR2-533 (no-name brand) for $55au, or 1GB DDR2-800 (Corsair brand) for $140au?
Gee... it's a tough choice, but I think the no-name stuff is the goer here...

Re:Why do these reviews only focus on one thing? (1)

InvalidError (771317) | more than 7 years ago | (#20605695)

Pretty much all new memory technologies have historically been ridiculously overpriced for the first several months following their introduction.

It takes a while for people to adopt new memory technologies because they do not want to pay the full introductory price. It takes a while for manufacturers to ramp up production because they do not want to end up with excessive inventory caused by slow initial uptake. It takes a while for new technologies to become mainstream but it will happen in due time as it gains traction on both consumer and manufacturer sides so the market can find its equilibrium.

Short-term (now), DDR3's main advantage is lower operating voltage and power. Medium-term (late 2008), DDR3 will enable migration of low-cost systems from DDR2-667 to DDR3-1066. Long-term, DDR3 will enable penny-pinchers to wait until DDR4 becomes mainstream and overclockers to brag about their DDR3's lower latency compared to bleeding-edge DDR4... the exact same story we had on the way from DDR to DDR2.

Re:Why do these reviews only focus on one thing? (1)

julesh (229690) | more than 7 years ago | (#20605849)

Pretty much all new memory technologies have been historically ridiculously overpriced for the first many months following their initial introduction.

Well, yes. But as the title of this article is "DDR3 Isn't Worth The Money - Yet", I don't see anyone disagreeing with this. The point is that it isn't worth it, for the vast majority of people, to buy this technology if they're upgrading their computers right now.

They never learn. Technology marches on. (2, Insightful)

Rod Beauvex (832040) | more than 7 years ago | (#20602635)

60ns SIMMs ought to be fast enough for anybody.

In a year's time, DDR3 will have totally supplanted DDR2.

Re:They never learn. Technology marches on. (1)

Ginger Unicorn (952287) | more than 7 years ago | (#20602769)

When the price comes down - I think that's the key point being made. Of course faster is better, just not if it costs a stupid amount of money. So when DDR3 costs the same or not much more than DDR2, then it will indeed become an attractive proposition.

Re:They never learn. Technology marches on. (1)

russlar (1122455) | more than 7 years ago | (#20602877)

60ns SIMMs ought to be fast enough for anybody.

People used to say the same thing about 64K.

Re:They never learn. Technology marches on. (0)

Anonymous Coward | more than 7 years ago | (#20603451)

>60ns SIMMs ought to be fast enough for anybody.

You know, the funny thing is that internal DRAM cell performance hasn't improved that much since your 60ns async DRAM days. If you look up the latency timings of synchronous DRAM and work backward, you'll come up with parameters similar to the old memory's. I'd be surprised if it's even half the latency (tRAS, tCAS).

The only difference is how it's connected to the outside world, plus more internal memory banks for parallelism.

Re:They never learn. Technology marches on. (1)

timeOday (582209) | more than 7 years ago | (#20605097)

There's a huge difference between "you don't need anything faster" vs. "DDR3 is not faster."

Re:They never learn. Technology marches on. (1)

Mattsson (105422) | more than 7 years ago | (#20605209)

In a year's time, DDR3 will have totally supplanted DDR2.
Which is exactly what the post said:

DDR3 will of course come into its own as speeds increase still further

Didn't this happen before? (2, Informative)

xx01dk (191137) | more than 7 years ago | (#20602673)

Anyone remember when DDR2 was rolled out and was actually *slower* than the standard of the day, regular DDR? It took about a year, IIRC, for the speed of the newer RAM to catch up to and overtake the older RAM, and even then it was still pricey. I expect that with the current glut of DDR2 on the market, it will take quite a while for DDR3 to be considered a worthy upgrade.

Re:Didn't this happen before? (0)

Anonymous Coward | more than 7 years ago | (#20602911)

I remember that myself. It's also funny how AMD upped their processor ratings by 200 MHz on the DDR2 chips. lol!

Heck, I just bought 2 GB of DDR for my box this month. So I'm definitely not worried about DDR3.

Let me know when the memory bus starts running at 300 or 400 MHz. (Actual speed, not double data rate nor quad pumped. lol)

Re:Didn't this happen before? (5, Interesting)

Zephiris (788562) | more than 7 years ago | (#20602999)

Part of the reason DDR2 was so much slower at most clock speeds is the added latency. The lower-speed DDR2 can have more than twice the tested latency of DDR400. The problem is that apparently JEDEC, or whoever standardizes memory now, isn't thinking about the best direction for DDR to take. They're going in the same direction as the manufacturers, trying to sell higher "megahertz" and "gigabytes per second" ratings, even when those are effectively meaningless now.

Does it really matter if your computer can do 6GB/s, or 12GB/s? 14GB/s? Where does it stop? Even then, those figures are mostly theoretical, particularly in the case of DDR2. A very important distinction is that so many memory accesses are very small. On basically all of those accesses, the data itself takes far less time to transfer than the latency spent waiting before another request can be served.

Way back when, Intel motherboards tried RDRAM for their 'higher-end' boards, and the Nintendo 64 also used it. Both were fairly large fiascos in that sense, with more or less all technical reviews noting that the increased latency more than cancelled out the improved bandwidth. Now we're looking at DDR3, with far higher latencies than classic RDRAM, for a relatively minor bandwidth improvement that only helps extremely large memory requests (such as those from very large-scale database and scientific-research applications).

It reminds me acutely of the early Pentium 4s. A 600MHz Pentium 3 could beat up to a 1.7GHz Pentium 4 in most applications and benchmarks, and the (rare and expensive) 1.4GHz Pentium 3s were real monsters. But people kept tailoring benchmarks to hide that, so people would buy more product.

Overclocking has also generally demonstrated that regular 'old' DDR1, while a bit pricier (mostly due to the virtual elimination of production nowadays), scales better and posts far better numbers than DDR2 and the like. DDR600-equivalent is extraordinarily zippy, and (of course) real-world latency is also absurdly low.

It makes me feel like the 'governing bodies' here have really let people down. Instead of standardizing on and promoting what's best for general computing, they're pushing a greater volume of merchandise that offers no meaningful improvement - and in fact usually a notable decline - over what we've had for years. The bottom line for them is money, and it's just wrong to put their own pocketbooks over the long-term well-being of computing technology and the needs of the consumer.

Re:Didn't this happen before? (1)

imgod2u (812837) | more than 7 years ago | (#20603447)

The increased latency is a real problem, but the argument is that the aggregate improvement over time is better. That is, there was no way to further improve standard DDR other than dual- or quad-channeling it (making 512-bit buses on the motherboard). There's a clear frequency ceiling unless you start increasing latency and pipelining memory accesses. There is a penalty, yes, and for latency-sensitive applications that do a lot of pointer-hopping, the application can actually be slower on DDR2 than on the original DDR.

This isn't some indication that the people who make these designs "just wanted higher numbers". That's a really short-sighted way to look at it. You might as well argue that going multi-core is "just to boast a higher number of cores", since it doesn't improve current, predominantly single-threaded applications.

As we move into the future, the ability of hardware to continue to improve independently of how software is written as well as how the rest of the system jives with it will become more and more limited (Netburst proved this). Software and hardware will be tied closer and closer together and *both* will have to change for performance increases.

In the case of multi-core, this means software will have to be multi-threaded. In the case of high-latency, but high-bandwidth memories, this means that software will have to do much less pointer-hopping and allow caching systems to be able to better hide latencies to memory while utilizing the higher-available bandwidth. Remember, not all small, short memory accesses will be affected by higher memory latency, only those that are unfriendly towards caching.

I honestly don't see why the ISA's of microprocessors don't allow direct control of the cache. If not the L2 cache for speed reasons, at least some kind of L3 cache. Make instructions that will allow software to allocate certain regions of cache to certain memory address spaces (CPU will translate and cache). If I have 4MB of L3 cache available that I can control directly, it's quite trivial to make a pixel processor, for instance, that pre-loads 3MB of pixel data and 1MB of meta-data that's needed (like coefficients) and run the loop (interleaving load and calculation blocks so that there won't be any downtime). That kind of scheme would normally cause caching systems to die because the meta-data is stored in a different region of memory than pixel-data. This would also eliminate the penalty (depending on how the L3 cache is designed) of unaligned memory accesses in the main calculation loop.

Re:Didn't this happen before? (0)

Anonymous Coward | more than 7 years ago | (#20605087)

But the latency is more a materials-science issue than an electrical-engineering issue. It is much harder to improve.

Re:Didn't this happen before? (1)

Agripa (139780) | more than 7 years ago | (#20605597)

Part of the reason that DDR2 was so much slower at most clockspeeds is because of the added latency. The lower speed DDR2 can have more than twice the tested latency of DDR400.


It is not quite that simple.

The latency is ultimately limited by the characteristics of the DRAM array which has a specific access time after the row and column addresses are provided. When you compare the latencies of DDR to DDR2 or DDR2 to DDR3, you need to take into account the interface clock speed. Internally, DDR-400, DDR2-800, and DDR3-1600 all run at the same clock speed but use proportionally higher external clock speeds meaning that 3 cycle latency DDR-400, 6 cycle latency DDR2-800, or 12 cycle latency DDR3-1600 all represent a period of 15 nanoseconds. You might notice that the actual access times have only improved by a factor of about 2 (80 to 40 nanoseconds) since the days of FPM DRAM.
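
The arithmetic above is easy to check: absolute first-word latency is CAS cycles divided by the interface clock, and it comes out the same for all three generations (clock rates and CAS figures are the ones quoted in the comment):

```python
# Interface clock (MHz) and CAS latency (cycles), as quoted above.
parts = {
    "DDR-400":   (200, 3),
    "DDR2-800":  (400, 6),
    "DDR3-1600": (800, 12),
}

for name, (clock_mhz, cas_cycles) in parts.items():
    # cycles / MHz gives microseconds; x1000 converts to nanoseconds
    latency_ns = cas_cycles / clock_mhz * 1000
    print(f"{name}: {latency_ns:.1f} ns")  # 15.0 ns in every case
```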

Re:Didn't this happen before? (0)

Anonymous Coward | more than 7 years ago | (#20606111)

Part of the reason that DDR2 was so much slower at most clock speeds is the added latency. The lower-speed DDR2 can have more than twice the tested latency of DDR400. The problem is that JEDEC, or whoever standardizes memory now, isn't thinking about what direction is best for DDR to take. They're going in the same direction as the manufacturers, trying to sell higher "megahertz" and "gigabytes per second" ratings, even when those are effectively meaningless now. Does it really matter if your computer can do 6GB/s, or 12GB/s? 14GB/s? Where does it stop? And even then, that's mostly theoretical, particularly in the case of DDR2. But a very important distinction is that so many memory accesses are very small. On basically all of those accesses, the request is served in far less time than it takes for the latency to let the command return and another request be issued.

One of the advantages to DDR2 and DDR3 is the high standard latency. Yeah, that sounds wrong, but it's the truth, because those numbers can be lowered using "performance" memory.

Today, I can buy DDR2-800 RAM with timings of 4-4-3-5. The equivalent in DDR would be DDR-400 with timings of 2-2-1.5-2.5, which just doesn't exist, and which would be prohibitively expensive if it did. Lowering any part of the latency from 3 cycles to 2 is much harder than lowering it from 6 to 4, and lowering from 9 to 6 (for DDR3) is easier still.

This is even reflected in the standard timings, where DDR-400 is 2.5-3-3-8 while DDR2-800 has 5-5-5-15, which is just a tiny bit faster. And faster timings are available for almost no premium: 4-4-3-5 DDR2-800 is $50/GB. Although there is no DDR3-1200, DDR3-1600 with timings of 7-7-7-18 is some of the least expensive memory today. I have no doubt we will see timings of 5-5-5-12 or so by the time it hits the mainstream. That would be like DDR2-1066 having timings of 3.5-3.5-3.5-8, which is much better than what you can buy today, at any price.
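
To compare timings across generations on an equal footing, convert cycles to nanoseconds at each part's interface clock. A small sketch using the numbers quoted above (timing order assumed to be CL-tRCD-tRP-tRAS):

```python
def timings_ns(clock_mhz, timings):
    """Convert timing values in cycles to nanoseconds at the given clock."""
    return [round(t / clock_mhz * 1000, 2) for t in timings]

# Standard parts quoted above:
print(timings_ns(200, [2.5, 3, 3, 8]))   # DDR-400  -> [12.5, 15.0, 15.0, 40.0]
print(timings_ns(400, [5, 5, 5, 15]))    # DDR2-800 -> [12.5, 12.5, 12.5, 37.5]
# "Performance" DDR2-800 at 4-4-3-5 is comfortably ahead in wall-clock terms:
print(timings_ns(400, [4, 4, 3, 5]))     # -> [10.0, 10.0, 7.5, 12.5]
```

In absolute time the "standard" DDR2 part is already on par with standard DDR, which is the comment's point about higher cycle counts not meaning higher latency.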

Re:Didn't this happen before? (0)

Anonymous Coward | more than 7 years ago | (#20606377)

DDR3 at 1333MHz is essentially the same performance as DDR2-800. It just has twice the bandwidth (which nothing takes advantage of at the moment).

Now, some real magic begins to happen when you step up to the 1800MHz DDR3 that is already available, based on Micron's Z9 chips. OCZ already has some real doozies out; check out this review:

http://www.anandtech.com/memory/showdoc.aspx?i=3053 [anandtech.com]

Or here are the comparison numbers:

http://www.anandtech.com/memory/showdoc.aspx?i=3053&p=4 [anandtech.com]

DDR2 held its own until the bus speeds were cranked up. The new 1800MHz DDR3 went all the way up to 2000MHz, achieving a 38.28 sec SuperPi score compared to the 45.20 sec best that DDR2 could throw out.

Yeah, yeah, I can hear all you snots saying that it doesn't have anything to do with the price of rice in China. Think about it this way: if it can shave almost seven seconds off a 45-second calculation, imagine what it could do to compile times. Might have to cut your coffee break a little shorter!!

Most times progress comes in little drops instead of floods.

Hasn't come out on Wii yet (1)

pembo13 (770295) | more than 7 years ago | (#20602689)

That's just because it hasn't gotten out for Wii just yet.

Yeah, lots of new technology... (1)

mdm-adph (1030332) | more than 7 years ago | (#20602693)

...isn't worth it when it's brand new. Give it a while for the price to come down.

This is... (0)

Anonymous Coward | more than 7 years ago | (#20602747)

...news?

Seriously, I don't want the sky to fall or anything, but sometimes no news is better than "it was obvious" news.

"Fast" DDR2 isn't just for overclocking (3, Interesting)

A Friendly Troll (1017492) | more than 7 years ago | (#20602775)

Intel's C2Ds love their memory bandwidth. Even the extreme low end, such as the E4xxx, can profit from something like DDR2-800 and an asynchronous 1:2 FSB:RAM. The E6xxx with their 266 MHz FSB can run at 2:3 with DDR2-800 and perform better than with 1:1 and slightly lower latencies.

Besides, the price difference between DDR2-533 and DDR2-800 is really small. You might as well go for it, if only for futureproofing your system.
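
The FSB:RAM ratios mentioned above translate into memory clocks like this (a sketch of the arithmetic only; actual divider options and strap behaviour vary by chipset):

```python
def ram_rating(fsb_mhz, fsb_part, ram_part):
    """Memory clock implied by an FSB:RAM divider; the DDR2 rating is 2x the clock."""
    clock = fsb_mhz * ram_part / fsb_part
    return clock, 2 * clock

print(ram_rating(200, 1, 2))  # E4xxx at 1:2 -> (400.0, 800.0), i.e. DDR2-800
print(ram_rating(266, 2, 3))  # E6xxx at 2:3 -> (399.0, 798.0), ~DDR2-800
```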

Re:"Fast" DDR2 isn't just for overclocking (3, Insightful)

Corporate Troll (537873) | more than 7 years ago | (#20602903)

There is no such thing as "futureproofing" a computer. I thought that once too, and spent ridiculous amounts of money on computers that were supposed to last a very long time. They did, but while I could run most future programs well and fast, the people I knew bought new computers for much less that did the same stuff faster than my futureproofed machine. In the end they bought more PCs for less money. While they had three machines over that time and I had only one, they always had the faster machines, except for the first six months when my machine was so overpowered it was insane.

Look at the people that bought the first DX10 graphics card in order to run Vista and play DX10 games. Microsoft has already revised the DX10 "standard" and obsoleted these cards.

Futureproofing in computing is not a good idea. Perhaps in servers, yes, but in desktops... No way.

Re:"Fast" DDR2 isn't just for overclocking (1)

A Friendly Troll (1017492) | more than 7 years ago | (#20604033)

Well, "futureproofing" in this context means replacing your E4400 @ 200 MHz FSB with a new quad-core Penryn that has an FSB of 333 MHz. It would be a noticeable upgrade for gaming, development, video encoding, etc. I agree with you that trying to buy the latest and greatest is a bad idea; my old PC lasted since 2000, and it was only replaced this year with medium-range components.

Anyway, the point is that if you buy that Penryn, the "good enough" DDR2-533 (266 MHz FSB) you bought with the E4400 isn't guaranteed to run at the DDR2-667 speed the new CPU needs. If you have "overkill" DDR2-667, it'll feel right at home with the Penryn... Except that you might even be able to get higher performance with a 4:5 ratio and "totally unneeded overkill for the E4400" DDR2-800 running at a 416 MHz FSB (doable for most memories).

I've just checked DDR2 memory prices where I live. Believe it or not, DDR2-800 is *cheaper* than DDR2-667, and I can only find one vendor with DDR2-533. So, if you are buying a DDR2-based system these days, the choice is clear. Actually, there's no choice.

P.S. We need more "Trolls" here :)

Re:"Fast" DDR2 isn't just for overclocking (0)

Anonymous Coward | more than 7 years ago | (#20604093)

20% decrease in memory access time with DDR3 and the new Supermicro CS2BX workstation motherboard. Benchmark results coming in 1 week time on http://www.supermicro.com/ [supermicro.com]

To every season, turn turn turn (2, Insightful)

Applekid (993327) | more than 7 years ago | (#20602839)

I remember the same discussion when DDR2 was hitting stores.

Re:To every season, turn turn turn (1)

Penguinisto (415985) | more than 7 years ago | (#20604057)

I remember the same discussion when DDR2 was hitting stores.

...or when PC-133 SDRAM first came out. Or when 72-pin SIMMs first came out. Or when you could stuff 4MB into a 286 instead of just 1 or 2.

Each step was nice, but hampered by the tech that used those parts (e.g. DOS and its apps were still fighting each other between EMS and XMS for using anything over 640k, back when boxes started coming out with 1, then 2MB of RAM on 'em).

...and don't get me started on how frickin' worthless that 512k RAM cartridge turned out to be on my old Commodore 64. It took forever just to set the thing up and load the handful of programs that made use of it (for instance, 20 minutes just to load and see a highly crude but realtime-rendered spinning Earth on the screen...)

But, never fear - I'm sure the next version of Vista (or perhaps even its service packs) will demand every ounce of bandwidth that DDR3 can give and then some...

/P

It's The Drives, Stupid (2, Insightful)

maz2331 (1104901) | more than 7 years ago | (#20602927)

Really, memory and CPU bottlenecks are not the biggest issue right now. The problem is and has been storage speed. It doesn't matter if we can crunch bits faster on the mainboard if we can't get them in and out to begin with. Memory and CPU speeds are skyrocketing and hard disk performance has stayed rather flat for years. Until drive performance catches up we'll still be waiting forever for the OS to boot up or apps to load.

Re:It's The Drives, Stupid (1)

imgod2u (812837) | more than 7 years ago | (#20603545)

That may be true for things like application boot-up and OS boot-up time, but I don't think those are a priority for speed-up. Most applications nowadays can run almost entirely out of RAM (and store their data sets in RAM); 2GB of memory is not uncommon. This makes memory speed the predominant limit on a computer's speed in most applications.

Having Photoshop filters run faster, or having iTunes transcode your "collection" of Simpsons episodes so you can play it on your iPod, are things that are computationally and memory bound.

Re:It's The Drives, Stupid (1)

ypps (1106881) | more than 7 years ago | (#20604047)

Photoshop filters and video transcoding are predictable interruptions that come in big chunks. You can then use your other core(s) to do other tasks, for example to load some files from HDD to RAM...

Opening photos for editing in Photoshop and opening video files for transcoding are tasks that are limited by HDD performance. This HDD lag comes in tiny bits all the time. You can't avoid it. It's also ANNOYING and pisses you off (some call it "micro stress"). Let's say you lose 10 seconds every two minutes that you are working on the computer. That's 5 minutes every hour that you are just waiting. It adds up if you spend a lot of time working on the computer. Maybe 20 minutes per day? Would your boss be happy if you suddenly started taking an extra 20-minute coffee break every day?

Once someone figures out a cheap way to get around this problem (cheaper flash memory drives, cheaper RAM drives, cheaper fast HDDs, or something else), our productivity will increase by several percent.
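
The numbers above, made explicit (the 8-hour figure is my assumption; the commenter's own 20 minutes/day actually implies about 4 hours of hands-on time):

```python
lost_s_per_min = 10 / 2                        # 10 s lost every 2 minutes of work
lost_min_per_hour = lost_s_per_min * 60 / 60   # = 5.0 minutes wasted per hour
hours_implied = 20 / lost_min_per_hour         # 20 min/day -> 4.0 hours at the keyboard
full_day = lost_min_per_hour * 8               # a full 8-hour day would lose 40 minutes
print(lost_min_per_hour, hours_implied, full_day)
```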

Re:It's The Drives, Stupid (1)

imgod2u (812837) | more than 7 years ago | (#20605449)

Considering I'm writing this from work, I don't think computer speed is the limitation on my productivity.

And while HD lag is annoying, the concern for most computational limits, IMO, has been processing-heavy workloads (simulation time, gaming, processing filters, etc.). The actual time it takes to load a picture from the HD is quite trivial compared to waiting 5 minutes for a black-and-white filter.

Not so sure... (1)

Nicolay77 (258497) | more than 7 years ago | (#20605793)

You keep telling that to yourself.

I want my Half-Life 2 levels to load faster.

Spread over 14 pages to serve you more ads (TM) (0)

Anonymous Coward | more than 7 years ago | (#20602949)

The "article", if it can be called that, is split into 14 separate pages. Why? To serve you more ads as you attempt to read through it, of course.

Cue the detractors (1)

dazedNconfuzed (154242) | more than 7 years ago | (#20602961)

Nearly every incremental step in technology is met with a barrage of "it's too expensive, it doesn't work right, it's not worth it, nobody will go there..." at which point it goes on to become the norm.

Is It Just Me? (1)

Tyler Eaves (344284) | more than 7 years ago | (#20603019)

Or does anyone else have trouble taking a site called "TrustedReviews" seriously?

Re:Is It Just Me? (1)

doombringerltx (1109389) | more than 7 years ago | (#20603153)

I trust them

Re:Is It Just Me? (1)

mogwai7 (704419) | more than 7 years ago | (#20603339)

I don't trust anything with 'trust' in the name/brand.

4X4 (1)

TheSHAD0W (258774) | more than 7 years ago | (#20603041)

Would DDR3 be worthwhile in a system with two quad-processors installed? I'm sure that'd load down the bus pretty heavily...

Re:4X4 (2, Interesting)

ZachPruckowski (918562) | more than 7 years ago | (#20603213)

On a dual-processor Intel machine, you have to move to FB-DIMMs. I'm not sure if there are currently DDR3 FB-DIMMs, but I don't think so. If there were DDR3 FB-DIMMs, they'd also be quad-channel.

On a dual-processor AMD machine, you have NUMA (non-uniform memory access), so each processor (processor, not core) has its own set of memory and its own bus, meaning you have two dual-channel buses.

same old story (1)

Cutie Pi (588366) | more than 7 years ago | (#20603173)

Is it just me, or does it seem like every new memory technology disappoints? I've built systems since before EDO DRAM was all the rage, and we've seen lots of advances since: Burst EDO, SDRAM, RDRAM, DDR, DDR2, DDR3... but every time one of these supposed breakthroughs debuts, the review sites quickly go to work and reveal (at most) 5-10% performance increases over the previous generation. Often it's in the 1-2% range. It seems very difficult to squeeze extra performance out of memory without changing everything around it as well: bus speeds, chipsets, processor speeds, timings, etc. It will likely take a while before DDR3 actually becomes beneficial.

Re:same old story (2, Interesting)

TheRaven64 (641858) | more than 7 years ago | (#20603271)

Intel, when prototyping a new CPU, runs it in a simulator. This simulates an entire computer and is very tweakable. A few years ago, they did an experiment: they made every CPU operation take no simulated time. Effectively, this meant that the CPU was infinitely fast. In their standard benchmark suite, this showed only a 2-5x performance improvement overall. After doing that, however, increasing the speed of RAM and the disk gave significant improvements.

A given generation of RAM may only make your current system 10% faster, but using the current generation in next year's system is likely to stop it reaching anything like its full potential.

The, summary, needs, more, commas. (0)

Anonymous Coward | more than 7 years ago | (#20603209)

I know this is offtopic, but what the heck is going on with people's use of commas these days? I've even found myself using them unnecessarily, just because I've been reading so damn many more of them than I used to.

DDR (0)

Anonymous Coward | more than 7 years ago | (#20603225)

I still haven't mastered Dance Dance Revolution 2.

Comma comma down doobie doo down down (2, Funny)

Anonymous Coward | more than 7 years ago | (#20603275)

This, is a, great article, and I will, read, it again and, again.

Re:Comma comma down doobie doo down down (4, Funny)

Just Some Guy (3352) | more than 7 years ago | (#20603807)

Who knew, William Shatner, wrote, tech articles?

Any good transitional mobos? (2, Interesting)

InvisblePinkUnicorn (1126837) | more than 7 years ago | (#20603423)

I'm looking for a motherboard that has DDR2 and DDR3 slots, but also a firewire port (and eSATA would be a plus), necessary for video editing. Any takers? I could only find one by Gigabyte on newegg but the reviews are mixed.

I can't keep up (1)

corporatemutantninja (533295) | more than 7 years ago | (#20603445)

I haven't even mastered Dance Dance Revolution #1 yet. There's already a 3?

Question (2, Interesting)

rehtonAesoohC (954490) | more than 7 years ago | (#20603555)

The real question I have is whether or not DDR2 is worth upgrading over DDR1. I have 2 gigabytes of DDR RAM in my computer, and I recently started thinking that upgrading might be a good idea. But would I notice a performance increase by upgrading to DDR2? I don't want to spend $150 on a new motherboard and RAM only to get a marginal speed boost.

Does anyone have any insight?

Re:Question (0)

Anonymous Coward | more than 7 years ago | (#20604113)

If you're just upgrading your mobo + RAM, no, probably not. Especially if you already have 2 gigs of DDR. It's usually only a CPU or GPU upgrade where you'll notice any improvements. Save your pennies until you can afford an entire refresh.

Re:Question (1)

Kjella (173770) | more than 7 years ago | (#20604259)

If you have a DDR1-era system, chances are the performance gain is minimal. Save your money for when it's time to get a new CPU as well.

Re:Question (1)

r3m0t (626466) | more than 7 years ago | (#20604459)

It depends on your current CPU and hard disk. If you have a very old CPU and a slow hard disk, then no. If you have more recent hardware (which seems a bit unlikely on a DDR motherboard) then your RAM may be holding you back.

Re:Question (1)

rehtonAesoohC (954490) | more than 7 years ago | (#20604813)

My video card is a Geforce 7900 GTX (512mb) and my CPU is an AMD Athlon X2 4600+. With that setup, would my RAM be holding me back?

Re:Question (1)

jalefkowit (101585) | more than 7 years ago | (#20604815)

I'm no expert but I wouldn't expect a big performance boost from upgrading from DDR to DDR2. Memory performance in general isn't the bottleneck in a typical desktop system; memory CAPACITY might be, but if you have 2GB already that's not the issue.


If you're looking for an easy speed boost, a new motherboard plus a new CPU would be the way to go; CPU performance has been increasing dramatically lately. Here's a chart from THG [tomshardware.com] that illustrates the progress; even the mid-range Core 2 Duos benchmark at 2-3 times the performance of an Athlon 3000+, and the high-end Core 2s are even faster. That's a much more dramatic performance boost than anything you'd see by upgrading your RAM.

Re:Question (1)

archen (447353) | more than 7 years ago | (#20604925)

Because of latency, DDR2 is only faster than DDR if you have a CPU over 2GHz clock speed. And pretty much all speed boosts are marginal nowadays. The only way you really notice the difference is through aggregated marginal increases, like CPU + mainboard + hard drive + RAM, etc. Typically you can't see much of a difference from changing out one part anymore.

shall we wait to web 3.0 first ... (1)

porky_pig_jr (129948) | more than 7 years ago | (#20605135)

before we move to DDR3?

No DDR2 yet, let alone DDR3 (1)

BlueParrot (965239) | more than 7 years ago | (#20605343)

I appreciate that some users make heavy use of graphics software and/or games, but for regular office use I am willing to bet that 90% of people have an absolute overkill of a system. I'm using a 1.6 GHz Pentium 4 with 640 MB of RAM (oblig: it should be enough for everybody ;), and currently about 258 MB of that (when accounting for buffers and cache) is used to run my desktop environment and most of the software I ever use. Essentially, I expect that in perhaps 2-3 years' time I might actually consider upgrading it along with the screen, and then I will probably just find someone about to replace their "old" Core 2 Duo. So in summary I want to praise Xfce for saving me a decent bunch of cash in terms of lower hardware requirements. It may not have the smallest footprint there is, but it is impressive what it manages to do with the low amount of resources it does use.

Re:No DDR2 yet, let alone DDR3 (1)

Nicolay77 (258497) | more than 7 years ago | (#20605855)

I do use graphics software and games.

And I have 2GB of DDR memory, not DDR2. It was the right choice for me a couple of months ago, and it has proven to be a great buy from a performance/cost point of view.

No Xfce or other weird software here, normal XP SP2 and lots of games.