105 comments

Overclocking an AMD? (0, Flamebait)

Adolf Hitroll (562418) | more than 6 years ago | (#21389287)

sounds boring...

Re:Overclocking an AMD? (2, Informative)

l_bratch (865693) | more than 6 years ago | (#21389397)

Does he even show any overclocking?

Maybe it was a problem at my end, but he just talked about this mythical tool for a while, then just as you really start to get into it, the video ends.

Re:Overclocking an AMD? (1)

Jeff DeMaagd (2015) | more than 6 years ago | (#21389639)

It looks like a lame marketing hype video anyway. I hate the way it's edited, like someone trying to be hip with all sorts of video effects and transitions. And now that you mention it, the storytelling is pretty bad too, one apparent error being that they left the resolution out.

Re:Overclocking an AMD? (1)

clayne (1006589) | more than 6 years ago | (#21391541)

That guy wishes he was Tommi Makinen!

One thing to note: remember the days of CPU locks and the various other schemes meant to prevent owners from overclocking. Now it seems like we've come full circle.

Even Intel is embracing it. [intel.com]

One thing I do continually find funny is the constant disclaimer of "fried" CPUs. Anyone who overclocks a CPU knows it's fairly difficult to actually damage it unless one is doing something incredibly stupid.

More legs! (0, Offtopic)

bvimo (780026) | more than 6 years ago | (#21389301)

Where do I add the extra leg?

Just release it already (1, Offtopic)

PhrostyMcByte (589271) | more than 6 years ago | (#21389325)

My old PC can't handle Crysis and I've been waiting to see the Phenom before upgrading!

Re:Just release it already (0)

Anonymous Coward | more than 6 years ago | (#21389393)

I'm sure they're still working the last of the bugs out. After all, would you want it to have all kinds of bugs like the chips Intel releases? ;)

Anyone watch the video? (0)

Anonymous Coward | more than 6 years ago | (#21389401)

It felt like a poorly edited high school project. I closed my window when I saw the spinning transparent spider swish pan. I mean, seriously. This could be something that would be well-received by the overclocking community. Less fluff, more stuff please!

Overclocking? (1)

arootbeer (808234) | more than 6 years ago | (#21389421)

I was kinda enthused by the fact that he's an active member of the overclocking community, but to stop the video right before it actually starts with the overclocking, and then call it a video about overclocking the platform, seems...specious.

From the first video, the platform looks interesting, but will it be able to do any of those things with just one video card (rather than FOUR)?

Why overclock when you can undervolt? (3, Interesting)

twfry (266215) | more than 6 years ago | (#21389457)

I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast and I'd assume that if you actually have a real need for more processing power, that you should be able to come up with the couple hundred bucks for another socket/proc.

Lately I've been undervolting to build silent systems. The latest AMD Brisbane processors at 2.1GHz can be undervolted to 1.05V and still pass my stress tests at speed, and stay below 40C with the 'silent' fan modes.

Re:Why overclock when you can undervolt? (3, Interesting)

bigstrat2003 (1058574) | more than 6 years ago | (#21389589)

Everyone wants different stuff. I have less than no interest in a quiet (let alone silent) system, but I am interested in a fast system. I never have been an overclocker, but I can easily understand those who are... it's about squeezing every last drop of performance out of that chip. No different from wanting a silent system, really: in both cases you're in a relative minority taking a concept to its extreme, just in different areas.

Re:Why overclock when you can undervolt? (1)

Bert64 (520050) | more than 6 years ago | (#21390673)

But how about squeezing every last bit of performance out of the software that runs on those chips?

Incidentally, I want quiet and power-efficient on the systems I keep running 24/7 at home (my laptop, though it's usually suspended, and a media server)... For a system that boots up to play games and then gets turned off, I'm not so concerned.
In a datacenter I want power efficiency and performance; noise is irrelevant.

Re:Why overclock when you can undervolt? (1)

bigstrat2003 (1058574) | more than 6 years ago | (#21390759)

But how about squeezing every last bit of performance out of the software that runs on those chips?
This is good too, but by and large, that's the programmers' concern, not yours.

Re:Why overclock when you can undervolt? (1)

Bert64 (520050) | more than 6 years ago | (#21391259)

Well, it seems pointless to go to such extremes to get as much performance as possible from your hardware (and potentially shorten its lifespan or void its warranty), only to then waste it running inefficient software. Surely a better idea would be more efficient software coupled with more reliable hardware.

As to your suggestion of the software performance being the programmer's concern, it could similarly be said that hardware performance is the concern of the engineers who designed it.

Re:Why overclock when you can undervolt? (1)

bigstrat2003 (1058574) | more than 6 years ago | (#21391641)

Not true. In general, it's easier to get more performance out of your hardware, especially with something like overclocking, which is about as simple as changing a software setting.

Re:Why overclock when you can undervolt? (1)

sznupi (719324) | more than 6 years ago | (#21392615)

I can think of a few programs (won't mention them by name, they're pet peeves of many users here...) that feel slow no matter how fast the machine is.

OTOH I use a few equivalents which feel much faster even on a 5-year-old machine... and it would seem that recognising good coding, and NOT having to upgrade your machine constantly, is quite easy...

Re:Why overclock when you can undervolt? (1)

theshowmecanuck (703852) | more than 6 years ago | (#21392491)

This is good too, but by and large, that's the programmers' concern, not yours.


It doesn't seem that way some days. I know that my current PC is orders of magnitude faster with respect to hardware. But it seems that programmers, or programming houses really, are just using the faster hardware to put out less efficient code. I say programming houses because they are the ones that pay the bills and want things done faster at the expense of quality so that they can sell more crap.

I wonder how fast something like Open Office would run if coded with the efficiency needed to run a program on older computers (not necessarily PCs) when they had to pay attention to resources and cycles. That or some of the games I've seen that take forever to load up even with a quite powerful dual core system. But of course it takes time to pay attention to those kinds of details, and time is money.

The size of the project doesn't even seem to matter when it comes to leaning on the hardware too much. I've seen large enterprise-level ordering and billing implementations founder because people... some supposedly very experienced... didn't think hard enough about performance when they should have: about how the system would scale with the number of users or the amount of data handled and/or stored.

Re:Why overclock when you can undervolt? (1)

John_Booty (149925) | more than 6 years ago | (#21392703)

I wonder how fast something like Open Office would run if coded with the efficiency needed to run a program on older computers (not necessarily PCs) when they had to pay attention to resources and cycles.


But today's programs are, in general, orders of magnitude more complex than those delightfully handcrafted versions of AppleWorks and WordPerfect from the 1980s you're feeling nostalgic for.

The effort required to build a piece of software does not scale linearly with the complexity - a piece of software that's twice as complex most likely takes something like four times the effort, if everything else (development toolchain, programmer skill) remains constant.

Simply put, you just couldn't build something like OpenOffice the way WordPerfect 5.1 was built.

Most of the software "bloat" isn't programmer laziness so much as it is the result of programmers using higher-level tools to manage and abstract away complexity just to keep a project reasonably manageable.

Re:Why overclock when you can undervolt? (1)

Creepy Crawler (680178) | more than 6 years ago | (#21390843)

I'd humbly argue that people wanting quiet/silent machines are far from a minority.

More and more people are hooking up a computer to their TV, and with that, DVR and playback software on their computers. When they've paid $2000 for the screen and sound, they don't want some box going zzzzzzzzzzzzzzzzzzzzzz on the side. Does the DishNetwork box make nasty whirrrs? Does the DVD player grind when you use it?

I don't think so. In that same light, nobody wants computers that growl like sleeping dinos.

Re:Why overclock when you can undervolt? (1)

Doctor Faustus (127273) | more than 6 years ago | (#21391223)

In that same light, nobody wants computers that growl like sleeping dinos.
I want a loud computer for the bedroom, to cut down on how much my wife complains about my snoring. I wouldn't like something like that as an HTPC, though.

Re:Why overclock when you can undervolt? (2, Interesting)

ceeam (39911) | more than 6 years ago | (#21389593)

Why not? Moderately overclocked CPUs generally don't consume much more power (they're still idle most of the time, and when they're not they finish their tasks _faster_), they are not louder, they are not even less reliable. Usually overclocking these days is just reversing "market positioning" and restoring the proper, designed CPU speed. And if you ever do video encoding or play CPU-bound sims you can never have enough.

I like silent systems too, but overclocked ones can be silent as well. The days of the P4 and early Athlons are thankfully long gone, and modern CPUs are pretty energy-efficient again.

Re:Why overclock when you can undervolt? (1)

EdZep (114198) | more than 6 years ago | (#21389651)

Has AMD always been OC friendly? I remember when Intel was actively discouraging the practice so as not to have sales of more expensive CPUs undercut.

Re:Why overclock when you can undervolt? (1)

ceeam (39911) | more than 6 years ago | (#21389865)

Who cares whether they have "always been" friendly or not? Current lines from both manufacturers are very OC friendly.

Re:Why overclock when you can undervolt? (0)

Anonymous Coward | more than 6 years ago | (#21390649)

I think both Intel & AMD lock the multipliers on most of their CPUs. They tend to discourage overclocking (probably because of warranties more than anything else) except on a few overpriced models.

Re:Why overclock when you can undervolt? (1)

clayne (1006589) | more than 6 years ago | (#21391665)

Well no questions here - it's obvious to me that you've done quite a lot of overclocking in your time...

Re:Why overclock when you can undervolt? (1)

JebusIsLord (566856) | more than 6 years ago | (#21392495)

The newer Black Edition Athlon X2s have unlocked multipliers at very affordable prices (~$130). Other than that, yes - only the highest-end models have an unlocked multiplier. I picked up a Black Edition at that price, and am running it at a clean 3200 MHz.

Bus overclocking has gotten much easier anyhow, since most chipsets let you do it without altering the PCI bus, for example. That used to be the issue back in the day - peripherals would start to flake out.

4 GPUs!!! = Loud, Hot, Expensive (1)

spineboy (22918) | more than 6 years ago | (#21390631)

I can't imagine the cinematic quality of watching a movie with 4!! GPUs (right next to each other for maximum cooling inefficiency) in my home. Maybe if it was The Perfect Storm, or a movie about propeller aircraft, it wouldn't be noticeable. Don't get me wrong, I MIGHT like to have that system (OK, probably would love it), but it seems like an inelegant way to do something.

Re:4 GPUs!!! = Loud, Hot, Expensive (3, Funny)

TeknoHog (164938) | more than 6 years ago | (#21390767)

I can't imagine the cinematic quality of watching a movie with 4!! GPUs

That's a lot of GPUs. 4 factorial factorial is about a mole.

Re:4 GPUs!!! = Loud, Hot, Expensive (0)

Anonymous Coward | more than 6 years ago | (#21391021)

Ha, interesting. Very close to Avogadro's Number indeed.

Re:Why overclock when you can undervolt? (0, Funny)

Anonymous Coward | more than 6 years ago | (#21389629)

I really don't see where the need to overclock comes from anymore.


...you seem to forget Vista...

Re:Why overclock when you can undervolt? (2, Insightful)

Jartan (219704) | more than 6 years ago | (#21389635)

I really don't see where the need to overclock comes from anymore.


You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.

Re:Why overclock when you can undervolt? (1)

bdjacobson (1094909) | more than 6 years ago | (#21393685)

I really don't see where the need to overclock comes from anymore.


You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.
Also, I think it's popular among poor people because you can spend $80 on a processor and overclock it to match a processor worth triple that. See the Intel E2180 for example. For me as a college student this is great.

Re:Why overclock when you can undervolt? (1)

kestasjk (933987) | more than 6 years ago | (#21389867)

Why is it overclock and undervolt and not overvolt and underclock?

Re:Why overclock when you can undervolt? (1)

Steve Hamlin (29353) | more than 6 years ago | (#21391725)

Why is it overclock and undervolt and not overvolt and underclock?

Semantics, and all that. Each phrase is named for the primary goal of the activity it was created to describe.

Overclocking is seeking the highest clock speed at which that particular processor can run, in order to maximize CPU capability, with only secondary consideration of the energy/thermal factors. It is not seeking to increase the voltage, although that is often an unfortunate side effect. Thus it's not "overvolting".

Undervolting is seeking to lower the voltage demand of the CPU/motherboard, in order to lower the energy/thermal requirements, with only secondary consideration of processing capability. It is not seeking to lower the clock speed and decrease CPU capability, although that can be an unfortunate side effect. Thus it's not "underclocking".

Re:Why overclock when you can undervolt? (1)

markass530 (870112) | more than 6 years ago | (#21389983)

Several reasons. I'll reference the AMD 5000+ Black Edition because I just picked one up for the new computer I'm building ($130). It's built on the 65 nm process, and overclocks easily to 3.3-3.5 GHz. I enjoy overclocking (it's a hobby) and I enjoy having a fast machine, even if I'm not always using it. I also do a lot of video/audio encoding and ripping, so every MHz counts to get those tasks done quicker. So I'm getting an extra (effective) 1.8 GHz or so out of this CPU, and overall a damn fast processor, for 130 bucks.

Re:Why overclock when you can undervolt? (0)

Anonymous Coward | more than 6 years ago | (#21390653)

You actually bought a Black Edition? Have you seen the price of a Q6600? Much better joy for an overclocker...

Re:Why overclock when you can undervolt? (1)

drinkypoo (153816) | more than 6 years ago | (#21389997)

I personally want both. I want a system that will self-overclock automatically and which will thermally throttle itself down, and which when idle will undervolt and underclock to save power, much like using the ondemand governor for linux power management, except at a deeper level. This principle is inherent to many larger, mechanical systems, like your car's engine (provided you have an oxygen sensor.) In cold weather, the air is denser, so you can burn more fuel, and you get more power. In hot weather, the opposite is true. The car handles this automatically, and you get more power in cold weather when the engine is able to give you more power. By the same token, sometimes it's cold in my room, and sometimes it's hot. Why shouldn't my computer run at a given temperature? Heat is the primary enemy of the longevity of electronics, right? By using heat as the ultimate limiting factor, we could wring maximum performance out of our computer, and thereby extend its useful life - and reduce our initial hardware needs as well.
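
For what it's worth, something close to that closed loop can be approximated in user space on Linux today. Below is a minimal sketch of the idea, not anything from AMD or the article; it assumes the standard cpufreq and thermal sysfs interfaces, and the paths, the 60 C target and the step sizes are placeholder assumptions that vary by board. It only moves the clock ceiling within the frequencies the driver already exposes; pushing past stock speed would still need board-specific FSB/multiplier control.

/* Rough sketch only: keep the CPU near a target temperature by nudging the
 * cpufreq ceiling up and down. Paths and numbers are assumptions.
 * Needs root; compile with: gcc -O2 -o thermgov thermgov.c */
#include <stdio.h>
#include <unistd.h>

#define TEMP_PATH "/sys/class/thermal/thermal_zone0/temp"                  /* millidegrees C */
#define FREQ_PATH "/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq"  /* kHz */

static long read_long(const char *path) {
    long v = -1;
    FILE *f = fopen(path, "r");
    if (f) { if (fscanf(f, "%ld", &v) != 1) v = -1; fclose(f); }
    return v;
}

static void write_long(const char *path, long v) {
    FILE *f = fopen(path, "w");
    if (f) { fprintf(f, "%ld\n", v); fclose(f); }
}

int main(void) {
    const long target = 60000;    /* stay near 60 C (pick your own limit) */
    const long step   = 100000;   /* adjust the ceiling in 100 MHz steps  */
    const long floor_ = 1000000;  /* never drop below 1.0 GHz             */
    const long ceil_  = 2600000;  /* never ask for more than 2.6 GHz      */

    for (;;) {
        long temp = read_long(TEMP_PATH);
        long freq = read_long(FREQ_PATH);
        if (temp < 0 || freq < 0) return 1;

        if (temp > target && freq - step >= floor_)
            write_long(FREQ_PATH, freq - step);       /* too hot: back off      */
        else if (temp < target - 5000 && freq + step <= ceil_)
            write_long(FREQ_PATH, freq + step);       /* running cool: creep up */

        sleep(2);
    }
}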

Re:Why overclock when you can undervolt? (0)

Anonymous Coward | more than 6 years ago | (#21390685)

You can do this using cpufreqd [sourceforge.net] and its tools. Obviously it's dependent on the hardware facility being there, and on driver support, but it works for my Athlon64 on nforce2 motherboard.

cpufreqd can trigger events based on temp sensors, system load, process list (including nice levels: overclock when you're running a render job, but don't overclock for seti or folding, say), time of day, etc. It's SMP aware. On my system it can change FSB and multiplier, as well as vcore (can't do vdimm yet I think). You can do this kind of thing: "if load is high, overclock as much as you can, unless temp becomes high, in which case progressively clock and volt down until it stabilises. When idle, clock and volt as low as possible.", which sounds like what you're looking for. I think it might even be able to spin down your discs.

Setting up the rules and getting them reliable I have found to be... awkward. But the capability is there.

Re:Why overclock when you can undervolt? (1)

clayne (1006589) | more than 6 years ago | (#21391795)

Some factual updates though:

A typical O2 sensor is only used during closed loop operation - i.e. part throttle and cruising. During full throttle or heavy load situations the system goes to open loop - which means preset fuel maps.

In addition, an IAT sensor (intake air temp) is also used for biasing the fuel mix based on manifold temperature.

Until we get to the point of using wide-band O2 sensors capable of extremely low latency, we won't be using O2s for anything but emissions and rough grained "tuning."

Re:Why overclock when you can undervolt? (1)

ThreeGigs (239452) | more than 6 years ago | (#21390209)

You overclock because most games can still only use one processor core. Dual cores let you offload everything but the game onto the other core, but you're still only running the game on just one core, and you want that core running as fast as possible. Yes, this is changing and more games can take advantage of multiple CPUs, but that's the exception right now, not the rule. Additionally, even if the game can run on multiple cores and use them all, you'd still overclock memory as far as it'll go.

Re:Why overclock when you can undervolt? (0)

Anonymous Coward | more than 6 years ago | (#21390583)

So you really think 6 hours to compress a 2-hour video is OK? Oh, maybe you're using your computer as a typewriter/web browser.

General Slashdot Trends (0)

Anonymous Coward | more than 6 years ago | (#21390691)

As a lurker for years I've observed several trends in comments here.
In 2000, slashdot posts were anti-Nintendo/pro-Sony, wattage was an afterthought, performance was king.
In 2007, slashdot posts are anti-Sony/pro-Nintendo, vehemently and sometimes threateningly anti-religion (you guys scare me), and people bitch that their CPUs are too fast.

I'm sorry, but I always need a faster CPU, so F you. I'll take power savings when my machine is idle, but not when I need it.

Re:Why overclock when you can undervolt? (1)

clayne (1006589) | more than 6 years ago | (#21391611)

I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast and I'd assume that if you actually have a real need for more processing power, that you should be able to come up with the couple hundred bucks for another socket/proc.
Or uhh, overclock it?

Lately I've been undervolting to build silent systems. The latest AMD Brisbane processors at 2.1GHz can be undervolted to 1.05V and still pass my stress tests at speed, and stay below 40C with the 'silent' fan modes.
Okay, now we see your true motivation for the above.

Anyways - as I'm sure you've experienced - we have the ability to undervolt AND hit peak voltage at the same time (with a base voltage set in BIOS of course).

http://cpu.rightmark.org/products/rmclock.shtml [rightmark.org]

CnQ, SpeedStep, etc.

I always find the irony of underclocker fanboys amusing, as if hitting the lowest possible temperature were manifest destiny. You realize that typical AMDs are stable to around 75C, right? You also realize that a CPU is a solid-state chip and not a car alternator?

Yes I realize it can "save money" through lower voltage used, but it's splitting hairs these days, and voltage scaling still wins out.

Re:Why overclock when you can undervolt? (1)

Urza9814 (883915) | more than 6 years ago | (#21391803)

Yea, except you can generally save around $50 by overclocking. That's why I do it. I can pretty much get processors for half the price of what they would be at the speed I'm running 'em. (Yes, I'm using $50 processors. What can I say, I'm on a budget.)

Re:Why overclock when you can undervolt? (1)

Repossessed (1117929) | more than 6 years ago | (#21391827)

I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast and I'd assume that if you actually have a real need for more processing power, that you should be able to come up with the couple hundred bucks for another socket/proc.
Quick price-point comparison here: I spent $170 on my processor (2.33 GHz) and about $60 (fan, thermal grease, and a bit extra on the case) on its cooling system. While I don't have it overclocked (the cooling system was mostly for fun), the whole thing should easily overclock to match Intel's fastest Core 2 Duo (3.0 GHz).

That processor I would be matching is about $280, so my net gain in terms of cash is pretty minimal (though more fun in my opinion).

However, the processor I have, or that more expensive processor, can also overclock significantly past anything on the market. The Conroe is considered stable at 5 GHz with water cooling by some people, and I'd be more than happy to take mine up to 3.66 if I needed to. So while you may not gain much in terms of not needing to fork over an extra hundred or so in cash, getting something that's not actually on the market is a big deal for some people.

Why overclock? It's cost efficient! (1)

Man in Spandex (775950) | more than 6 years ago | (#21392147)

Why overclock? Why not? That's my question. "Pretty damn fast" for whom? For a slashdotter who's programming in FORTRAN, or for a gamer who needs more clocks?

With today's heatsinks, at most what you'd need is a $40-50 heatsink and your CPU can reach the speeds of a processor that costs double or triple the one you have. Most video cards out there can overclock without any modifications to the cooling.

Overclocking is safe too, if you know what you're doing. If your PC starts displaying artifacts on the screen, you know you've reached your limit without hurting your hardware. Oh I know, you'll tell me that you're reducing the hardware's lifespan! Blablabla, how many of you honestly use the same processor after a few years? I know that as geeks we have old hardware lying around that can be made into pr0n file servers, but most hardware can already last more than a decade.

Of course, nothing is guaranteed in overclocking, so don't be surprised if your hardware isn't reaching the speeds someone else you know gets, but generally you can get extra boosts out of what you have with very little to add on, at most a good CPU heatsink.

Choo! Choo! All aboard! (3, Insightful)

skoaldipper (752281) | more than 6 years ago | (#21389471)

The marketing train...

I felt like I just got run over. Nice job, AMD. Actually, the first flashvert was pretty slick with the transformer, and was fairly informative. Honestly, I didn't extract much information from the overclocking one, except for its availability date.

Forgive me, but it's early Saturday morning here. And in the spirit of today's morning cartoon ritual, while munching on some Lucky Charms cereal I fully expected the overclocking advert to finish with...

"Shh! Be vewy vewy qwiet, I'm on my AMD hunting for more raw bits! Eh. Heh! Heh! Heh! Heh!"

Re:Choo! Choo! All aboard! (1)

DigiShaman (671371) | more than 6 years ago | (#21391119)

Some of us are waking up with a hangover you insensitive clod! To hell with your Lucky Charms.

DISCRETE (3, Informative)

Sockatume (732728) | more than 6 years ago | (#21389633)

Discrete = distinct, seperate. Discreet = subtle, low-key. That is all.

Re:DISCRETE (0)

Anonymous Coward | more than 6 years ago | (#21389709)

Only in English ...

Re:DISCRETE (1)

Gigiya (1022729) | more than 6 years ago | (#21389793)

Thanks - I always thought one was a misspelling of whichever one was the correct version of what I thought was only one word.

Re:DISCRETE (5, Funny)

Dr. Cody (554864) | more than 6 years ago | (#21390365)

(to) dis Crete = to insult a Greek isle

Re:DISCRETE (0, Troll)

Liancourt Rocks (867396) | more than 6 years ago | (#21391375)

Crete [...]

Greek isle [...]
... and/or Turkish.


Totally waiting to be modded down by ass-pounding, six-fingered insulars :-)

Re:DISCRETE (0)

Anonymous Coward | more than 6 years ago | (#21394707)

I for one welcome our new comedian overlord.
Did it take you hours to think of that gem, or do these things come to you naturally?

Re:DISCRETE (1)

Anonymous Coward | more than 6 years ago | (#21390813)

separate = discrete
seperate =

What's new? (4, Insightful)

mollymoo (202721) | more than 6 years ago | (#21389851)

Perhaps I'm missing something, but this is nothing new at all, is it? I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel have been for years. But AMD make one where the graphics card is targeted at gamers. Whoop-de-fucking-do.

Re:What's new? (3, Interesting)

drinkypoo (153816) | more than 6 years ago | (#21390013)

That IS new, and it IS a big deal. It is a sign! It's a sign that there's enough consumers who want their games to just fucking work on PC without having to worry about what hardware they're going to buy, like a console. Sure, you don't get all the benefits of console gaming, but you don't get all the drawbacks, either. So now AMD is interested in catering to this market - it means that the market [probably] exists, which indicates that the overall gaming market is growing. That's not news to most of us, but it's still a positive sign of the direction in which the market is heading. Personally, I am more interested in integrated systems today because I am no longer chasing the latest and greatest, I just want something cheap that works. My primary system is now a laptop (albeit the most powerful one that was available at the time I purchased it) and I like it that way. I am down to one desktop system and I have drive sleds for it so it can be a variety of testing systems. Everything else is a laptop or some other SFF unit (like my iopener, or my xbox.)

Re:What's new? (1)

Bert64 (520050) | more than 6 years ago | (#21390725)

Intel's videocards have always been very much budget cards... Fine for general office computers but useless for gaming or heavy video related work.

Re:What's new? (5, Informative)

moosesocks (264553) | more than 6 years ago | (#21390981)

I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel have been for years. But AMD make one where the graphics card is targeted at gamers. Whoop-de-fucking-do.


Not quite. The role of the GPU is stepping up to be much more important than "just games".

Newer operating systems rely extensively on the GPU to render the desktop, apply various effects to it, etc.... These tasks can be as simple as alpha blending, or as complex as providing a hardware-accelerated version of Photoshop.

It's not quite there yet on Windows (Vista implements it rather poorly), but Linux and OS X have been using OpenGL acceleration on the desktop for quite some time now. In what might be a first for a 'desktop' feature, support for it on Linux is actually quite good, and provides a rather nice UI experience (once you turn all of Compiz's superfluous effects off, that is).

I'm going to jump in here as a part-time Apple fanboy, and also point out that Apple's very heavily pushing its set of accelerated 2D Graphics libraries [arstechnica.com] toward developers to integrate into their applications to provide a more natural and fluid experience. In 10.5, OpenGL rendering is pervasive in almost every part of the user interface. Once you've got that framework in place, it becomes very easy to do all sorts of fun stuff without worrying about bogging down the CPU.

Even fast modern CPUs perform miserably when it comes to graphics operations, as they're not designed to cope with vector and matrix operations. With high-resolution displays becoming prevalent these days, it makes a good deal of sense to offload as much of the processing as possible to the GPU. If you implement this properly in the operating system, it's even transparent to the users AND developers. It's very much a no-brainer.

Many GPUs these days also provide accelerated support for video encoding/decoding, which is also a rather strenuous task for a normal desktop CPU to handle efficiently. Video editing applications can also take advantage by providing realtime previews of HD video rendered with effects applied to it.

Anyone who's done a substantial amount of video editing knows just how welcome this would be. Ironically, it's a shift back to an older paradigm, as the Amiga Video Toasters included an array of specialized graphics hardware to do all of the dirty work, and did it in real-time.

This might also translate into some sort of energy savings, given that modern CPUs consume very little power when idle, although this is pure speculation on my part.

There are all sorts of fun applications for this sort of technology once the frameworks are in place. Read up on Apple's 'Core' set of libraries for a fascinating peek into the future of UI and software design. Pixelmator [pixelmator.com] is one of the first applications to take extensive advantage of these features, and is an absolute joy to work with. Although its featureset isn't as extensive as Photoshop, it's damn impressive for a 1.0 product, and I'd daresay that it's a hell of a lot more useful to mainstream audiences than the GIMP is, and has a sexy UI to boot. Dragging the sliders when tweaking a filter, and watching the ENTIRE image smoothly change as you drag the slider seems like nirvana to photographers and graphic artists (even on somewhat old hardware)

So yes. This is a big deal. Everyday desktop software is transitioning toward relying upon the GPU for basic tasks, and AMD has stepped up to the plate to provide a decent set of entry-level graphics hardware to fill in the gap. Remember the state of video hardware before nVidia came along, and introduced the TNT2 and later the Geforce2-MX? Before them, decent 3d graphics hardware was an extravagant luxury. Afterward, it was easily affordable, and nearly ubiquitous.

I should also point out that Intel's graphics hardware is absolute shit. That comparison's just not fair.

Re:What's new? (1)

Penguin Follower (576525) | more than 6 years ago | (#21391233)

I'm going to give you a great big "THANK YOU" for that link to Pixelmator. I did not know about this program. I'm really impressed with the low price, too. With so many programs well over $100, at $59 I'm going to have to give it a try, and buy it if I do like it.

Re:What's new? (1)

moosesocks (264553) | more than 6 years ago | (#21391815)

No problem! It's shareware, so do give it a try before plunking down $60 for it. It IS of course missing some of the features you'd expect in Photoshop, although it's got more of the 'essentials' than the GIMP presently does. It's also

There seem to be a few inexpensive graphics apps coming onto OS X, rushing to fill in the gap, given that there weren't really many options apart from the GIMP and Photoshop (one's rather undesirable, and the other's rather expensive and outdated).

Pixelmator leads the pack, but there are one or two other raster and vector apps that look promising, and some competition will certainly be welcome.

I can't wait to see similar independent/inexpensive page layout and video-editing tools come forward.

Also, the GIMP team needs to take a long hard look at Pixelmator, consider the fact that it took a very small team about a year to develop, have themselves a good cry, and then bring their darn program up to a usable state.

Re:What's new? (0)

Anonymous Coward | more than 6 years ago | (#21391457)

Are you saying the voodoo2 and voodoo3 weren't affordable pieces of equipment? Not to mention some of their other cards. Dude, way to re-write history.

Re:What's new? (1)

Bottlemaster (449635) | more than 6 years ago | (#21391679)

Even fast modern CPUs perform miserably when it comes to graphics operations, as they're not designed to cope with vector and matrix operations.
Actually, they are designed for it. Modern CPUs have SIMD instruction set extensions (MMX, 3DNow!, SSE, etc.) that allow the CPU to perform vector and matrix operations quite efficiently. In fact, on the Sega Dreamcast, the GPU didn't even do vertex transformation: all transformations were done on the CPU before the polygons were sent off to the GPU. The Dreamcast's SH-4 processor can multiply a vector by a 4x4 matrix in less than 40 cycles. A modern general-purpose processor performs miserably at several graphics operations, but vector and matrix math is not the bottleneck.
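
As a concrete illustration of that point (my own sketch, not from the parent or the article): a 4x4 matrix times a 4-vector with SSE intrinsics takes only a handful of instructions. The column-major layout, names and alignment attribute are assumptions; compile with gcc -msse.

#include <stdio.h>
#include <xmmintrin.h>

/* m: 16 floats, column-major; v and out: 4 floats; m and out 16-byte aligned. */
static void mat4_mul_vec4(const float *m, const float *v, float *out) {
    __m128 r = _mm_mul_ps(_mm_load_ps(m + 0), _mm_set1_ps(v[0]));           /* col0 * x   */
    r = _mm_add_ps(r, _mm_mul_ps(_mm_load_ps(m + 4),  _mm_set1_ps(v[1])));  /* + col1 * y */
    r = _mm_add_ps(r, _mm_mul_ps(_mm_load_ps(m + 8),  _mm_set1_ps(v[2])));  /* + col2 * z */
    r = _mm_add_ps(r, _mm_mul_ps(_mm_load_ps(m + 12), _mm_set1_ps(v[3])));  /* + col3 * w */
    _mm_store_ps(out, r);
}

int main(void) {
    __attribute__((aligned(16))) float m[16] = { /* identity matrix, column-major */
        1, 0, 0, 0,   0, 1, 0, 0,   0, 0, 1, 0,   0, 0, 0, 1
    };
    __attribute__((aligned(16))) float v[4] = { 1, 2, 3, 4 };
    __attribute__((aligned(16))) float out[4];

    mat4_mul_vec4(m, v, out);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]); /* prints: 1 2 3 4 */
    return 0;
}

Batched over thousands of vertices this is cheap, which is the parent's point: vertex math is not where a modern CPU falls down on graphics work.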

Re:What's new? (0)

Anonymous Coward | more than 6 years ago | (#21393473)

I'll have to disagree with you entirely here. Yes, the role of the GPU is increasingly important, but no, AMD provided zero insight on this. Ask any gamer, there was nothing insightful here. AMD is launching a platform so you can buy all your important parts from one vendor... whoopity do.

Guess what, no matter what vendor parts I use, they all interact magnificently. AMD isn't going to change that, hell they are dreaming if they think they are going to make anything cheaper unless they are going to shove packaged deals down my throat.

They mentioned scalable; they lied. What they mean to say is, if I want to waste more of my money, I can buy more cards at launch and SLI/Crossfire them. By the time I might actually need that additional power, I won't be able to get that card; just look at the Nvidia 7900GT.

Any company stupid enough to use the word teraflop has lost my respect. This was just a show of marketing jumbo and that AMD has no clue about their gaming population as consumers.

Re:What's new? (1)

mollymoo (202721) | more than 6 years ago | (#21393603)

Sorry, but Spider is not an entry-level system designed to pep up your accelerated desktop experience, it's intended to give you more fps in Crysis. The graphics cards which are part of Spider are ATI's fastest cards; they are gaming cards through and through and cost over $200. So while accelerated desktops are the future, AMD Spider isn't targeted at that future.

AMD & 64-bit CAD/CAM/OpenGL (1)

mosel-saar-ruwer (732341) | more than 6 years ago | (#21391029)


I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel have been for years. But AMD make one where the graphics card is targeted at gamers. Whoop-de-fucking-do.

Soon ATI/AMD will be releasing a new high-end GPU series, called Stream [sci-tech-today.com], as a competitor to nVidia's Quadro FX series.

Traditionally, ATI supported only 24-bit floating point numbers on their consumer-grade GPUs [whereas nVidia & Matrox supported 32 bits on their consumer-grade GPUs], but Stream will support 64-bit floating point numbers, which, in combination with the AMD HyperTransport bus, has the potential to produce a signal-processing workstation that might very well find itself on the DOD "Banned-For-Export" list.

Re:What's new? (1)

ameoba (173803) | more than 6 years ago | (#21392195)

The bit about overclocking is significant because there was a big stink recently about how the Phenom CPUs were going to be launching at fairly low clock speeds. If the overclocking is well supported, it really changes what shipping at low clock speeds means.

I don't see the value (2, Insightful)

bl8n8r (649187) | more than 6 years ago | (#21390049)

Sucking up mass jigawatts of power off the grid to juice 4 video cards for gaming is insane. The target groups for this rig are people with compensation problems or ones with no concept or care for energy conservation. We're moving in the wrong direction folks.

Re:I don't see the value (3, Insightful)

slyn (1111419) | more than 6 years ago | (#21390239)

I don't think it's a problem you'll need to worry about anytime soon. According to this [steampowered.com], only 0.41% of about 165000 Steam users (when I just checked) have 2 GPUs. The number is probably way smaller for 3-card users, and probably barely anyone has a 4-card setup. The performance just doesn't scale well enough in SLI/Crossfire for it to be worth buying two GPUs. IIRC the performance increase in framerate is only around 30% if you are using two of the same model of GPU. It's just not cost-effective enough for the masses to want to spend on these.

Re:I don't see the value (1)

steveaustin1971 (1094329) | more than 6 years ago | (#21390755)

Using Steam for info on who is using SLI is not very accurate; Steam is for Source game freaks, and a lot of them play those because they DON'T have good systems... Counter-Strike being the most popular. Source is not really very hard on the graphics card. But since they released COD 4, which is built with SLI in mind, Tiger Direct keeps selling out of SLI bundles. And as for the 30% framerate increase, that's the LOW end; it's actually 30% to 75%. I have a machine running two 7300GT cards on an AMD 3200+ that gets 80 FPS in Battlefield 2, so it's cheap AND fast; I also have a Pentium D with two 8800s that will run Crysis at 50-60 fps with the graphics cranked. The market for SLI is not just expensive quad-core high-end rigs; it's another way to build fast, cheap gaming systems by linking two older cards together as well. $400 Canadian will get you an SLI barebones that will run current game titles with ease. Also, some of us just get excited about building these systems and seeing how far we can push them; I honestly have more fun putting machines together and running benchmarks than I do playing some of the games...

Re:I don't see the value (2, Insightful)

Wrath0fb0b (302444) | more than 6 years ago | (#21390327)

Sitting at home with any amount of computing power has to be more energy efficient than taking a car anywhere.

I think this counts as insightful (4, Insightful)

Kupfernigk (1190345) | more than 6 years ago | (#21390629)

In order to get to and from the office in a small European city car, with about the same real-world consumption as a Prius, I use enough fuel to produce about 6 kWh of electricity, enough to run a 4-GPU, 2-screen rig for a morning (including the monitors). That is on the very low side for commutes; the guy who commutes from the next large city in his SUV uses as much fuel in a day as I do in two weeks. If one of the ultimate goals of these systems is virtual working in a photo realistic environment, they could be big enough to need a substantial water cooling system and still reduce global warming.
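
A rough sanity check of that 6 kWh figure, using assumed numbers rather than the poster's own (a 30 km round trip at 5 L/100 km, about 9.5 kWh of thermal energy per litre of petrol, and roughly 40% conversion efficiency at the power plant):

    1.5 L x 9.5 kWh/L x 0.40 = 5.7 kWh

which would indeed keep a rig plus monitors drawing a bit over a kilowatt going for most of a morning.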

Re:I think this counts as insightful (1)

Shark (78448) | more than 6 years ago | (#21391753)

It isn't called global warming anymore. The new official name for it is 'climate change'. The UN would be in quite a pickle justifying carbon taxes if it turns out that we aren't warming in a few decades. But you can never argue that the climate isn't changing. I say they're just taking the safe bet, regardless of which side of that 'consensus' (god forbid there be a debate) you are on.

That said, I am *totally* for energy efficiency, if only because waste is bad. But if they really want to save the earth, they should focus on things like deforestation and overfishing. Especially since the best way to soak up all that CO2 is to have lots of trees around.

I think this counts as crayon-realistic. (0)

Anonymous Coward | more than 6 years ago | (#21392165)

"If one of the ultimate goals of these systems is virtual working in a photo realistic environment, they could be big enough to need a substantial water cooling system and still reduce global warming."

I doubt that is the goal. We're still quite a ways from a photo-realistic VR environment even on the high end, let alone at the commodity level. Plus no one has proven that photo-realism is even necessary for the telecommuting experience.

Re:I don't see the value (1, Informative)

Anonymous Coward | more than 6 years ago | (#21390627)

Since when is forward the wrong direction?
What's wrong with having 4 graphics cards? Especially in this case ones that _aren't_ heavy on the noise or wattage side. 4 cards could be used for graphics, or some combination of graphics and physics, or just heavy "general purpose" compute power (where I use the term "general purpose" as loosely as can be applied to a graphics card...make no mistake that the kinds of apps that a GPU can accelerate are rather specialized).

But you guys always wanted beowulf! (0)

Anonymous Coward | more than 6 years ago | (#21390867)

This is slashdot. Home of the beowulf posts.
After all these years whining about "imagining a beowulf cluster of these", you FINALLY HAVE YOUR WISH!
And now you bitch that it's the "wrong direction"?

Make up your minds.

Re:I don't see the value (1)

Kjella (173770) | more than 6 years ago | (#21391065)

Oh, give me a break. While the cards suck up a little power, you're not polluting significantly, not more than our most efficient power plants anyway. You're not contributing to the throw-away society, you're not littering or using a product that's made from animals, used for animal testing, endangering any species, or much of anything else. I doubt it's any worse than buying a bigger car, a bigger house, imported foods or a helluva lot of normal social activities. If you ever drove down to McDonalds in your SUV, you probably damaged the environment far more than a few watts do.

Sure, it probably wouldn't be a very good idea if everyone around the world burned 500W for home entertainment every night. But what I'm saying is that of the people who could afford a $2000 rig today, many pollute far more with that money. At least when I ask myself what I'd use that extra money for, I fall into that category myself...

Where's Imageon... (1)

LEX LETHAL (859141) | more than 6 years ago | (#21390103)

I'm desperately waiting for the ATI Imageon embedded in next-gen smartphones and Pocket PCs. I'm overclocking my Blackjack, and when I get dedicated graphics, things will be no different.

Re:Where's Imageon... (1)

fitten (521191) | more than 6 years ago | (#21392205)

Imageon, apply directly to your smartphone.
Imageon, apply directly to your smartphone.
Imageon, apply directly to your smartphone.

The bux are heavy in my pocket (0)

Anonymous Coward | more than 6 years ago | (#21390195)

Looks great. Where can I buy one of these Phenoms? Oh wait... I can't get one that's even as fast as good ol' Q6600. Intel it is then.

Slashvertisement (0)

Anonymous Coward | more than 6 years ago | (#21390465)

What, those billions from Abu Dhabi not enough for AMD to pay for some regular advertising? They have to resort to astroturfing like this?

This is just marketing plush (1)

bocaJWho (1080217) | more than 6 years ago | (#21391181)

If AMD really wants to show that they're serious about letting the overclocking community have their way, why don't they just unlock the clock multiplier on the CPU? I remember that way back with the original Athlon, you could accomplish this with just a mechanical pencil and be well on your way to melting your CPU (with the plus side of not having to change your frontside bus, thus keeping other system components like the chipset and memory fairly happy as well). The problem was, of course, that their higher-end chip sales really were being undercut by this, so they worked on making the process a lot harder. If AMD were serious about overclocking, they would let you change the multiplier, but really they're serious about making a slick-looking system so that kids can feel like they're really rad while they play with the system settings.

Re:This is just marketing plush (0)

Anonymous Coward | more than 6 years ago | (#21391991)

I guess you haven't heard of the Athlon X2 Black Edition, which features unlocked multipliers, and overclocking doesn't void your warranty with this version. Still, we'll see how faithful AMD is about overclocking when their quad cores become more established in the consumer market.

A finnish geek running Vista? (0)

Anonymous Coward | more than 6 years ago | (#21391905)

There goes the credibility.

Of course it's Vista (0)

Anonymous Coward | more than 6 years ago | (#21392021)

Who else would have a motivation to overclock a quad core already running at 2.2GHz?

For GPGPU, that would be great. (0)

Anonymous Coward | more than 6 years ago | (#21392683)

I think the AMD Spider platform would rock if I were going to build a GPGPU cluster (onboard low-latency 10GbE would make it truly rock!)

If you want an uber gaming rig, I would overclock a quad-core Penryn and SLI a pair of high-end Nvidia cards, because Nvidia is doing AA better than ATI right now.

Re:For GPGPU, that would be great. (1)

GeeZee (1190439) | more than 6 years ago | (#21393131)

I don't know what planet you are from, but ATI has always had better image quality than Nvidia. Take a look at their 24x FSAA for the processing cost of 6x FSAA when running Crossfire. On top of that you can enable Temporal AA, alternate pixel centers, Adaptive AA, the ATM alpha fix & sharpen mode, EATM, RNPP, & so much more. Load up ATI Tray Tools and see all the options most people will never use. I'm also guessing you haven't owned an ATI card in quite a while. FYI, the Phenom FX will be able to do 3.4 GHz on air, 3.6 on water, easy... with every core. Odds are you will have at least one core that will smoke the others and OC even further. And your uber gaming rig with the Nvidia card? Check what cards are on top of 3DMark05 & 06... it's the 2900XT Crossfire OC'd on water.

Re:For GPGPU, that would be great. (1)

The_GURU_Stud (955937) | more than 6 years ago | (#21393605)

Calm down on the ATI front there. The initial chips will probably be lucky to do 3.2 (until the process matures, obviously). Most people will take an 8800 over the ATI any day. I'm not disagreeing with most of the IQ points, but the 8800 is a slam dunk on price/performance. And anyone worth the balls in their sack doesn't use a synthetic benchmark as an effective argument for actual performance. Go ahead and load up a game on those top OC'd video cards: at a decent resolution (not the crappy default 1280x1024), the Nvidia card is going to pull away the higher the res goes.

Priorities ? (1)

udippel (562132) | more than 6 years ago | (#21394451)

I really don't get it. Having been a regular AMD user since 1989, I see the market share AMD gained during the last few years slipping, its lead position overtaken, its finances floundering.
Of course, Spider has the potential to win over the hard-core gamers and overclockers (and maybe the energy-conscious underclockers). But - I didn't do the research here, this is wild guessing - for every hard-core gamer, 10 or 100 CPUs are sold to the general public (desktop), and another 10 or 100 CPUs are sold to be used in servers.
In order to survive, AMD needs numbers, not a dedicated hard-core user group.

Having preferred ATI to NVIDIA for years as well, I now use (and recommend) NVIDIA, because NVIDIA has finally reached reasonable coverage with its video drivers for Linux, FreeBSD and Solaris. Plus, their drivers install and configure okay.

I am afraid AMD has not got its priorities right, except in the field of low power consumption (and even there the 35 W EE is moving to a 45 W basis).