Overclocked Radeon Card Breaks 1 GHz

ScuttleMonkey posted more than 8 years ago | from the stable-overclocking-unstable-overclockers dept.

Graphics 199

dacaldar writes "According to Yahoo Finance, noted Finnish over-clockers Sampsa Kurri and Ville Suvanto have made world history by over-clocking a graphics processor to engine clock levels above 1 GHz. The record was set on the recently-announced Radeon® X1800 XT graphics processor from ATI Technologies Inc."


Huzzah! (5, Funny)

Anonymous Coward | more than 8 years ago | (#13883998)

What a day for world history! It will be remembered forever!

Re:Huzzah! (3, Funny)

Spy der Mann (805235) | more than 8 years ago | (#13884159)

A small clock cycle for Radeon, a giant step for mankind?

Awesome! (1, Funny)

Anonymous Coward | more than 8 years ago | (#13883999)

First Duke Nukem Forever post
First Imagine a Beowulf Cluster of those post
First Can Netcraft confirm that? post

Re:Awesome! (2, Funny)

Anonymous Coward | more than 8 years ago | (#13884249)


Hmm. Your ideas are intriguing to me and I wish to subscribe to your newsletter.

Re:Awesome! (1)

jzeejunk (878194) | more than 8 years ago | (#13884322)

but ... Can it run linux? :p

Re:Awesome! (1)

someone300 (891284) | more than 8 years ago | (#13884383)

Or, more importantly, will it run on linux?

More markup = more nesting (0)

Anonymous Coward | more than 8 years ago | (#13884395)

First tired joke about first tired joke post post.

You too are a meme!

One wonders... (5, Interesting)

kko (472548) | more than 8 years ago | (#13884000)

why this announcement would come out on Yahoo! Finance

Re:One wonders... (2, Funny)

JamesTRexx (675890) | more than 8 years ago | (#13884035)

Looking at the article, I'd say it's to give ATI shares a boost.

Re:One wonders... (1)

grub (11606) | more than 8 years ago | (#13884041)


Look at the bottom of the page. It's just a press release announcing that ATI was the first to get to 1 GHz. Basically a "fuck you" to Nvidia, nothing more.

Re:One wonders... (5, Insightful)

Jarnis (266190) | more than 8 years ago | (#13884056)

... because ATI made a big press release about it.

Since their product is still mostly vapor (you can't buy it yet), and nVidia is currently owning them in the high-end market because ATI's product is so late, one has to grasp at straws to try to look l33t in the eyes of potential purchasers.

Wish they'd spend less time yapping and more time actually putting product on the shelves.

Nice overclock in any case, but ATI putting out a press release about it is kinda silly.

Re:One wonders... (0)

Anonymous Coward | more than 8 years ago | (#13884435)

Wow, it's vapor? Sure must have been REALLY hard to make a cooling system for that then. Oh wait, what's this, they're selling it on newegg? http://www.newegg.com/Product/Product.asp?Item=N82E16814102610 [newegg.com].

Wake up you mindless troll.

No big surprise (4, Funny)

Z0mb1eman (629653) | more than 8 years ago | (#13884001)

I didn't have Slashdot in a full screen window, so the headline read:

Overclocked Radeon Card Breaks
1 GHz

Was wondering why an overclocked card breaking is such a big deal :p

7ghz (0)

Anonymous Coward | more than 8 years ago | (#13884003)

They should have clocked it to 7.1 GHz with liquid nitrogen...

n00b (2, Funny)

Anonymous Coward | more than 8 years ago | (#13884034)

I clocked mine to 12 GHz with a combination of liquid magnesium and WD-40.

Jeez... (0)

Anonymous Coward | more than 8 years ago | (#13884190)

...is there anything one can't do with that stuff?



(Judging from Big boys [big-boys.com] , I doubt it.)

Benchmarks? (5, Funny)

fishybell (516991) | more than 8 years ago | (#13884009)

Without the pretty graphs how will I know what's going on?!

Re:Benchmarks? (1)

b0r1s (170449) | more than 8 years ago | (#13884038)

Well, it probably ran really fast for a few minutes, then broke.

Video drivers, as notoriously buggy and fragile as they are, don't handle clock speed changes very well at all. You can get a few percent without much problem, but the difficulty climbs much faster than it does with CPU overclocking.

Congrats, though - it's only a matter of time until it happens in production chips.

Re:Benchmarks? (1)

KingOfGod (884633) | more than 8 years ago | (#13884196)

One thing I'd like to know is: how does overclocking my graphics card improve my (graphic) gaming experience?

Re:Benchmarks? (1)

netkid91 (915818) | more than 8 years ago | (#13884286)

It's all about the FPS. The more frames DirectX and OpenGL can put out in a second, the smoother the picture renders. A standard NTSC TV signal is 29.97 FPS, while the average GFX card these days can pull off 6000 (not that I have ever had such a high framerate; must be my 600 MHz celery CPU and my PCI (not Express) Xtasy-branded ATI Radeon 9200 card).

GPU to excel CPU (1)

hadj (926126) | more than 8 years ago | (#13884013)

The performance of GPUs seems to grow faster than that of CPUs. I remember someone proposed using GPUs to process generic data. It would be 12 times faster than a CPU.

Re:GPU to excel CPU (2, Interesting)

TEMM (731243) | more than 8 years ago | (#13884061)

GPUs are great at working on linear algebra problems, which is basically what graphics are. For general-purpose computing, however, they would not be that much faster than a CPU.
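
To make the parent's point concrete, here is a minimal sketch of the kind of element-wise linear algebra a GPU handles well, written in present-day CUDA purely as an editorial illustration (the posters in this thread were working with 2005-era shader tools, not CUDA): a SAXPY kernel in which every output element is independent, next to the equivalent serial CPU loop. A CUDA-capable card and toolkit are assumed (compile with something like "nvcc saxpy.cu -o saxpy").

    #include <cstdio>
    #include <cuda_runtime.h>

    // y[i] = a * x[i] + y[i] -- every element is independent, so the GPU can
    // assign one thread per element and run thousands of them concurrently.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    // The same operation on the CPU is one long serial loop.
    void saxpy_cpu(int n, float a, const float *x, float *y) {
        for (int i = 0; i < n; ++i) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // ~4096 blocks of 256 threads cover all 2^20 elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }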

Re:GPU to excel CPU (3, Interesting)

steveo777 (183629) | more than 8 years ago | (#13884101)

I recall reading an article on /. a long time ago involving a group of coders from MIT or something like that who pitted a P4 CPU against an ATI or nVidia GPU that was running at about a third the clock speed with a tenth of the memory. They were, of course, running mostly linear equations, and the CPU got its pants kicked in by just under an order of magnitude, IIRC.

What I've been waiting for is some sort of mathematics program (I used to use Mathematica in college) that could utilize this concentrated power, rather than hampering the CPU.

Does anyone know if any researchers have gone as far as to utilize a GPU for anything like this?

Re:GPU to excel CPU (3, Informative)

LLuthor (909583) | more than 8 years ago | (#13884156)

Re:GPU to excel CPU (1)

Isca (550291) | more than 8 years ago | (#13884197)

The more interesting processor to do this with would be this:
http://www.ageia.com/technology.html [ageia.com]
Graphics cards do certain types of mathematical equations very well. The physics PPU specializes in more complex processing.

Re:GPU to excel CPU (1)

Saiyine (689367) | more than 8 years ago | (#13884142)


It would be 12 times faster than a CPU.

No. The G in GPU stands for Graphics, not Generic :P. They are x times faster than CPUs at what they are designed to do, matrix computations, so the trick would be to use matrix-related algorithms to accomplish the same work you would do with a CPU.

More info at the BrookGPU website [stanford.edu].

Re:GPU to excel CPU (2, Informative)

OzPeter (195038) | more than 8 years ago | (#13884171)

You mean like these people are doing?

Generic GPU programming [gpgpu.org]

Re:GPU to excel CPU (4, Informative)

Jerry Coffin (824726) | more than 8 years ago | (#13884188)

The performance of GPUs seems to grow faster than that of CPUs. I remember someone proposed using GPUs to process generic data. It would be 12 times faster than a CPU.

Go here [gpgpu.org] for several examples of this -- far from simply having been proposed, it's been done a fair number of times.

The thing to keep in mind with this is that while the GPU has a lot of bandwidth and throughput, most of that is due to a high degree of parallelism. Obviously 1 GHz hasn't been a major milestone for CPUs for quite a while, but CPUs are only recently starting to do multi-core processing, while GPUs have been doing seriously parallel processing for quite a while.

Along with that, the GPU has a major advantage for some tasks in having hardware support for relatively complex operations that require a fair amount of programming on the CPU (e.g. multiplying and inverting small vectors; there is typically a single instruction to find the Euclidean distance between two 3D points, and so on).

That means the GPU can be quite a bit faster for some things, but it's a long way from a panacea -- you can get spectacular results applying a single mathematical transformation to a large matrix, but if you have a process that's mostly serial in nature, it'll probably be substantially slower than on the CPU.

Along with that, development for the GPU is generally somewhat difficult compared to development on the CPU. Writing the code itself isn't too bad, as there are decent IDEs (e.g. ATI's RenderMonkey), but you're working in a strange (though somewhat C-like) language. Much worse is the essentially complete lack of debugging support. Along with that, you have to take the target GPU into account in the code (to some extent). I just got a call in the middle of a meeting this morning from one of my co-workers, pointing out that some of my code works perfectly on my own machine, but not at all on any of his. I haven't had a chance to figure out what's wrong yet, but I'm betting it stems from the difference in graphics controllers (my machine has an nVidia board but his has Intel "Extreme" (ly slow) graphics).

--
The universe is a figment of its own imagination.
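
As a hedged illustration of the two points above, the hardware-assisted small-vector math and the serial case, here is a sketch in the same present-day CUDA style (again an editorial example, not anything the poster or ATI shipped): one thread per 3D point computes its Euclidean distance from the origin, because every distance is independent of the others.

    #include <cstdio>
    #include <cuda_runtime.h>

    // One thread per point: each distance is independent of all the others,
    // so the GPU's wide parallelism applies directly. A loop-carried
    // dependency (d[i] depending on d[i-1]) is the "mostly serial" case the
    // parent describes and would not benefit from this layout.
    __global__ void distances(int n, const float3 *p, float *d) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            d[i] = sqrtf(p[i].x * p[i].x + p[i].y * p[i].y + p[i].z * p[i].z);
    }

    int main() {
        const int n = 1024;
        float3 *p;
        float *d;
        cudaMallocManaged(&p, n * sizeof(float3));
        cudaMallocManaged(&d, n * sizeof(float));
        for (int i = 0; i < n; ++i) {
            p[i].x = (float)i;
            p[i].y = 0.0f;
            p[i].z = 0.0f;
        }

        distances<<<(n + 255) / 256, 256>>>(n, p, d);
        cudaDeviceSynchronize();

        printf("distance of point 3 from the origin: %f\n", d[3]);  // expect 3.0
        cudaFree(p);
        cudaFree(d);
        return 0;
    }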

Speed play offs. (5, Funny)

neologee (532218) | more than 8 years ago | (#13884019)

I always knew ati would finnish first.

Hockey play offs. (1)

heauxmeaux (869966) | more than 8 years ago | (#13884040)

Shut up.

Re:Speed play offs. (1)

eosp (885380) | more than 8 years ago | (#13884415)

Feed it salmiakki... that will make anything run faster. On a side note, is it true that in the Finnish Christmas tradition, joulupukki (Santa Claus) was a child-eating wild boar?

A bit presumptuous? (4, Insightful)

syphax (189065) | more than 8 years ago | (#13884029)

have made world history

I think that's going a bit far. Good for them and everything, but world history? V-E Day, Einstein's 1905, Rosa Parks refusing to give up her seat on the bus -- these events impact world history (sorry for the all-Western examples); making a chip oscillate faster than an arbitrary threshold does not.

Re:A bit presumptuous? (5, Funny)

bcattwoo (737354) | more than 8 years ago | (#13884097)

What are you talking about? I'm sure I will have next October 26 off to celebrate Overclocked Radeon Broke 1GHz Barrier Day. Heck, this may even become Overclocked GPU Awareness Week.

Re:A bit presumptuous? (1)

Pulzar (81031) | more than 8 years ago | (#13884219)

What are you talking about? I'm sure I will have next October 26 off to celebrate Overclocked Radeon Broke 1GHz Barrier Day. Heck, this may even become Overclocked GPU Awareness Week.

I'll be the first one to ask my company to give everybody a day off to celebrate such a day. I'll celebrate anything, really, as long as I get my day off.

rosa parks (0, Offtopic)

drewxhawaii (922388) | more than 8 years ago | (#13884306)

Not really sure why she gets all the credit for that. She wasn't the first, and far from the only...

Cause (1)

carguy84 (897052) | more than 8 years ago | (#13884377)

OutKast wrote a song about her.

3D Mark? (1)

SirDrinksAlot (226001) | more than 8 years ago | (#13884032)

I'm not going to take that seriously until I see some actual 3DMark results. I can overclock my 9800Pro to some insane speeds but once I start to push it I get all kinds of corruption.

Re:3D Mark? (1)

Ironsides (739422) | more than 8 years ago | (#13884410)

Your wish is granted
http://www.muropaketti.com/3dmark/r520/12419.png [muropaketti.com]

Re:3D Mark? (0)

Anonymous Coward | more than 8 years ago | (#13884455)

Doesn't seem very impressive considering the CPU is very overclocked also. I have heard of NVIDIA's 7800GTX doing 10k with mild overclocking, certainly not with liquid nitrogen, that's for sure.

Global Warming (1, Redundant)

koick (770435) | more than 8 years ago | (#13884037)

AHA! There's the proverbial smoking gun for global warming!

The culprit (4, Funny)

ChrisF79 (829953) | more than 8 years ago | (#13884043)

I think we've found the source of global warming.

Re:The culprit (0)

Anonymous Coward | more than 8 years ago | (#13884221)

[stands up]

I broke the dam!

Re:The culprit (1)

Idontpostmuch (918650) | more than 8 years ago | (#13884339)

I broke the dam!

We didn't listen!

Re:The culprit (0)

Anonymous Coward | more than 8 years ago | (#13884452)

I broke the dam!

Re:The culprit (0)

Anonymous Coward | more than 8 years ago | (#13884352)

[stands up]

I broke the dam!

Re:The culprit (1)

Buzz_Litebeer (539463) | more than 8 years ago | (#13884370)

Even though this is funny, there is a trilogy called "Night's Dawn" which noted that the Earth could overheat just from the waste heat given off by all of the appliances and energy being pumped into the atmosphere every day.

World history? (2, Insightful)

Seanasy (21730) | more than 8 years ago | (#13884053)

... have made world history...

Uh, it's cool and all, but not likely to be in the history books. (Easy on that hyperbole, will ya?)

Re:World history? (2, Funny)

FidelCatsro (861135) | more than 8 years ago | (#13884126)

I am getting fed up with all the hyperbole around here; they are worse than Hitler.

(*Think about it*)

Re:World history? (1)

ichigo 2.0 (900288) | more than 8 years ago | (#13884437)

But it's Hyperbole day!

Someday people will ask... (4, Funny)

Keith Mickunas (460655) | more than 8 years ago | (#13884054)

where were you when the first video card was overclocked to 1 GHz? And most people will respond, "Huh?"

Seriously, "world history"? There's no historical significance here. It was inevitable, and no big deal.

Historical? (1, Insightful)

Anonymous Coward | more than 8 years ago | (#13884060)

How is this any more historical than overclocking it to 993 MHz? It's not! 1 GHz is just a nice round number. If I overclock one to 1.82 GHz tomorrow, no one will care!

Re:Historical? (1, Funny)

Anonymous Coward | more than 8 years ago | (#13884164)

Yeah, but if you can hit 2.0 GHz...

We'll just see (5, Funny)

crottsma (859162) | more than 8 years ago | (#13884064)

NVidia will make a competitive model, with blackjack, and hookers.

Re:We'll just see (1)

MmmmAqua (613624) | more than 8 years ago | (#13884106)

In fact, forget the blackjack! And the competitive model!

Re:We'll just see (1)

Jerry Coffin (824726) | more than 8 years ago | (#13884277)

NVidia will make a competitive model, with blackjack, and hookers.

According to most of the reviews, nVidia's competitive model is the 6800GT -- which is now a generation out of date.

--
The universe is a figment of its own imagination.

Re:We'll just see (0)

Anonymous Coward | more than 8 years ago | (#13884314)

Thanks Bender... See you at the moon casino! LOL

Hookers? I'll buy it! (0)

Anonymous Coward | more than 8 years ago | (#13884326)

nt

What's the point of these tests? (3, Insightful)

pclminion (145572) | more than 8 years ago | (#13884067)

If you cool a chip, you can make it run faster. This is a matter of physics that doesn't need to be tested any more than it already has been. In some small way I appreciate the geek factor but I'm far more interested in geek projects that have some practical use.

And as for being the first people in the world to do this... the chances of that are small. I'm sure there are people at Radeon (and other companies) who have done things far more bizarre, but didn't announce it to the world.

Re:What's the point of these tests? (1)

spencerogden (49254) | more than 8 years ago | (#13884124)

Yes, but a lot of different chips have different overclocking potential. It's interesting to see which can be pushed the furthest, even if it's impractical. Besides, since when are geeky pursuits practical?

Re:What's the point of these tests? (2, Insightful)

pclminion (145572) | more than 8 years ago | (#13884173)

Yes, but a lot of different chips have different overclocking potential. It's interesting to see which can be pushed the furthest, even if it's impractical.

Really, I don't think it's interesting whatsoever. It's like testing the strength of various bulletproof glass samples at a temperature of -100 C. The fact is, bulletproof glass is not used in such environments so the test gives no useful information.

Besides, since when are geeky pursuits practical?

I can't believe you're being serious. My geeky pursuits pay for my house.

Re:What's the point of these tests? (1)

LesPaul75 (571752) | more than 8 years ago | (#13884208)

This is a matter of physics that doesn't need to be tested any more than it already has been.
Yeah, but it's also well known that if you force more air/gasoline/nitrous into a car engine, it will go faster. But people continue to try to break land speed records. It's human nature. People do it just for the sake of doing it.
I'm sure there are people at Radeon (and other companies) who have done things far more bizarre, but didn't announce it to the world.
The company is ATI, and no, they don't do anything like this. They don't care what happens to their chips at -80C or +180C. All they do is test them a little bit beyond the limits of their recommended operating range. If you operate a chip way outside that range, they aren't going to guarantee that it will work.

Re:What's the point of these tests? (1)

pclminion (145572) | more than 8 years ago | (#13884388)

The company is ATI

Yeah I goofed, sorry.

They don't care what happens to their chips at -80C or +180C. All they do is test them a little bit beyond the limits of their recommended operating range.

I highly doubt it. I have some pretty intimate knowledge of what sorts of things go on at a certain giant chipmaker, and believe me, all kinds of crazy shit has been done that you don't know about simply because they don't tell you. I imagine ATI is similar. I don't know if they've done anything exactly like this, but the experiments certainly reach into this realm of the bizarre.

The difference is, they have specific reasons for conducting those tests other than "Whooo, look what I did!"

Re:What's the point of these tests? (1)

slackmaster2000 (820067) | more than 8 years ago | (#13884213)

Agreed. Yes it's pretty neat, but I'm not sure if it tells us anything useful.

They had the GPU cooled to -80C. Heat is a *huge* limiting factor in what a processor is capable of. So taking heat out of the equation and then being amazed that a GPU reached 1 GHz is a tad silly.

What would be cool is to see some gaming benchmarks on the GPU running that fast as a sort of look into the future. Although it's not really looking into the future since there is a lot more to a next generation GPU than just clock speeds. Actually, you can just imagine it with about the same results... so maybe it's not so cool, scratch that :)

Anyhow, I'm sure they had a lot of fun doing it, so good for them!

Re:What's the point of these tests? (0)

Anonymous Coward | more than 8 years ago | (#13884229)

Dude, you're not going to get a 386 to run at 5 GHz even if you cool it to absolute zero. There's a limit to how far you can go with that "if you cool a chip, you can make it run faster" line.

Not for the weak (5, Insightful)

Mr. Sketch (111112) | more than 8 years ago | (#13884071)

The team, optimistic that higher speeds could ultimately be achieved with the Radeon X1800 XT, attained the record speeds using a custom-built liquid nitrogen cooling system that cooled the graphics processor to minus-80 degrees Celsius.

It seems we may have a ways to go before it can be done with standard air cooling. I actually didn't think that operating temperatures for these processors went down to -80C.

Why so cold? (1)

elgatozorbas (783538) | more than 8 years ago | (#13884362)

Good point! Why must they be so cold? I always thought that the problem was removing heat (i.e. not letting it get hot) rather than making it ridiculously cold. Anyone?

Re:Why so cold? (1)

gardyloo (512791) | more than 8 years ago | (#13884442)

If you can come up with a clever way of removing heat from a chip quickly without making its surroundings much cooler, I'm sure we'd all be happy to hear it.

Re:Why so cold? (0)

Anonymous Coward | more than 8 years ago | (#13884471)

I always thought that the problem was removing heat (i.e. not letting it get hot) rather than making it ridiculously cold.

Well for one thing, being submerged in liquid nitrogen is a pretty damn good way to remove heat :-P

Transistors are not instantaneous devices -- it takes time for them to switch between "on" and "off". This time decreases as the transistor gets colder. But that's not the main reason to cool the chip.

The other reason is that shorter clock pulses have less energy in them than longer ones. The wires within the chip have resistance, which continually saps this energy even further. On chips there is a concept called "fan-out", which basically means how many different places the same signal is being sent to. This divides the energy still further. If the clock pulses get too fast, the fan-out and resistance combine to chop the pulses up so small that they are no longer able to reliably trigger the transistors, and the chip just stops working.

Keeping the chip extremely cold reduces the wire resistance and lets you get away with shorter clock pulses.
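
A rough back-of-the-envelope sketch of that argument, using generic textbook values rather than anything from the article: the time a signal needs to charge the gates it fans out to scales with wire resistance and load capacitance, and metal resistance falls roughly linearly with temperature.

    t_{\text{delay}} \;\approx\; R_{\text{wire}}(T)\,\bigl(N_{\text{fanout}}\,C_{\text{gate}} + C_{\text{wire}}\bigr),
    \qquad
    f_{\max} \;\propto\; \frac{1}{t_{\text{delay}}}

    R_{\text{wire}}(T) \;\approx\; R_0\,\bigl[1 + \alpha\,(T - 20\,^{\circ}\mathrm{C})\bigr],
    \qquad
    \alpha \approx 0.004\ \mathrm{K}^{-1}\ \text{(copper or aluminum interconnect)}

Dropping the die from around 85 C to -80 C therefore cuts the interconnect resistance roughly in half (1 + 0.004*65 is about 1.26 versus 1 + 0.004*(-100) is about 0.6), which is one reason the shorter clock pulses still arrive with enough energy to switch the transistors.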

comon now (5, Funny)

Silicon Mike (611992) | more than 8 years ago | (#13884084)

If I could only go back in time and add liquid nitrogen to my 8088 processor. I know I could have gotten it up to 5.33 MHz, no problem. NetHack benchmarks would have been off the chart.

GPU vs. CPU Speed (1, Interesting)

Anonymous Coward | more than 8 years ago | (#13884087)

I've always wondered... why have GPU speeds always been so much slower than CPU speeds?

Are they made on a different process? Are they made with different materials? Are there significantly more transistors on a GPU?

Why don't we have a 3 GHz GPU?

Re:GPU vs. CPU Speed (3, Informative)

freidog (706941) | more than 8 years ago | (#13884241)

Since DirectX 8 (I think), the color values have been floating-point numbers. This is to avoid losing a lot of possible values through all the blending with multi-texturing and effects (fog, lighting, etc.), which are of course much slower than very simple integer calculations. Even on the Athlon 64, FP adds and muls take 4 cycles; you'd have to drop the top-end A64 to about 700 MHz if you made them single-cycle. (Multi-cycle instructions aren't as bad a thing on the CPU, as there are plenty of other things to do while you wait; not so in GPUs.)

GPUs have also tended to focus on parallel execution, at least over the last few years, increasing the number of pixels done at the same time to compensate for not being able to hit multi-GHz speeds. So yes, they have many more transistors than typical CPUs (the 7800 GTX might break 300 million, well over 250 million), and of course heat is an issue if you push the voltage and/or clock speeds too far. The last few generations of GPUs have been up around 65-80 W real-world draw, more than most CPUs out there. And of course GPUs have very little room for cooling in those expansion slots.
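
To illustrate the precision point in the first paragraph with a concrete toy case (an editorial sketch in present-day CUDA, not anything from DirectX or from this thread; the 0.1x/10x scale factors are made up): rounding back to 8 bits per channel after every blend stage loses values that a floating-point pipeline keeps exactly.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each thread takes one 8-bit colour value, darkens it to 10% and then
    // brightens it back 10x. The fixed-point path rounds to 0..255 after
    // every stage (the precision loss described above); the float path
    // keeps the intermediate value exact.
    __global__ void blend_demo(int n, const unsigned char *in,
                               unsigned char *out_int, float *out_float) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        unsigned char c = in[i];
        c = (unsigned char)(c * 0.1f + 0.5f);                // darken, round to 8 bits
        c = (unsigned char)fminf(255.0f, c * 10.0f + 0.5f);  // brighten, round again
        out_int[i] = c;

        float f = (float)in[i];
        f = f * 0.1f;       // same two stages, no intermediate quantisation
        f = f * 10.0f;
        out_float[i] = f;
    }

    int main() {
        const int n = 4;
        unsigned char *in, *out_int;
        float *out_float;
        cudaMallocManaged(&in, n);
        cudaMallocManaged(&out_int, n);
        cudaMallocManaged(&out_float, n * sizeof(float));
        unsigned char samples[n] = {7, 37, 128, 201};
        for (int i = 0; i < n; ++i) in[i] = samples[i];

        blend_demo<<<1, n>>>(n, in, out_int, out_float);
        cudaDeviceSynchronize();

        for (int i = 0; i < n; ++i)
            printf("in=%3u  8-bit path=%3u  float path=%.1f\n",
                   (unsigned)in[i], (unsigned)out_int[i], out_float[i]);
        // e.g. 37 comes back as 40 on the 8-bit path, but 37.0 on the float path.

        cudaFree(in); cudaFree(out_int); cudaFree(out_float);
        return 0;
    }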

Re:GPU vs. CPU Speed (5, Insightful)

xouumalperxe (815707) | more than 8 years ago | (#13884426)

Well, while the CPU people are finally doing dual core processors (essentially, two instruction pipelines in one die, plus cache et al), the GPU people have something like 24 pipelines in a single graphics chip. Why is it that the CPU people have such lame parallelism?

To answer both questions: graphics are trivial to parallelize. You know to start with that you'll be running essentially the same code for all pixels, and each pixel is essentially independent of its neighbours. So doing one or twenty at the same time is mostly the same, and since all you need is to make sure the whole screen is rendered, each pipeline just needs to grab the next unhandled pixel. No synchronization difficulties, no nothing. Since pixel pipelines don't stall each other for syncing, you effectively have a 24 GHz processor in this beast.
On the other hand, you have an Athlon 64 X2 4800+ (damn, that's a needlessly big, numbery name). It has two cores, each running at 2.4 GHz (2.4 * 2 = 4.8, hence the name, I believe). However, for safe use of two processors for general computing purposes, lots of timing trouble has to be handled. Even if you do have those two processors, a lot of time has to be spent making sure they're coherent, and the effective performance is well below that of a single processor at twice the clock speed.

So, if raising the speed is easier than adding another core, and gives enough performance benefits to justify it, without the added programming complexity and errors (there was at least one privilege escalation exploit in Linux that involved race conditions in kernel calls, IIRC), why go multi-processor earlier than needed? Of course, for some easily parallelized problems people have been using multiprocessing for quite a while, and actually doing two things at the same time is also a possibility, but not quite as directly useful as in the graphics card scenario.
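
The per-pixel independence described above can be sketched in the same present-day CUDA style (again purely illustrative; a real 2005 pixel pipeline was fixed-function hardware plus shaders, not CUDA): one thread per pixel, each writing its own output exactly once, so no locks or synchronization between "pipelines" are needed.

    #include <cstdio>
    #include <cuda_runtime.h>

    // One thread per pixel, launched as a 2D grid. Each pixel is written
    // exactly once by exactly one thread, so no atomics or synchronisation
    // between threads are required -- the property described above.
    __global__ void shade(int w, int h, unsigned char *img) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x < w && y < h)
            img[y * w + x] = (unsigned char)((x ^ y) & 0xFF);  // cheap test pattern
    }

    int main() {
        const int w = 640, h = 480;
        unsigned char *img;
        cudaMallocManaged(&img, w * h);

        dim3 block(16, 16);
        dim3 grid((w + 15) / 16, (h + 15) / 16);
        shade<<<grid, block>>>(w, h, img);
        cudaDeviceSynchronize();

        printf("pixel (100,200) = %u\n", (unsigned)img[200 * w + 100]);  // (100 ^ 200) & 0xFF = 172
        cudaFree(img);
        return 0;
    }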

And you thought slashdot was bad (1)

Edunikki (677354) | more than 8 years ago | (#13884090)

After the LCD screen "news" earlier, I am glad to see that the unashamed marketing was submitted to Yahoo on this one ;).

It was 2D mode only (5, Interesting)

anttik (689060) | more than 8 years ago | (#13884098)

Sampsa Kurri said on a Finnish forum that it was over 1 GHz only in 2D mode. They are going to try to run it at the same clocks later. ATI left some tiny details out of their press release... ;P

Re:It was 2D mode only (4, Funny)

jandrese (485) | more than 8 years ago | (#13884161)

It also apparently crashed a lot. This is kind of like saying "I got a Volkswagen Beetle up to 200 km/h[1]!!!" with a whole lot of modifications.

[1] Going downhill

Re:It was 2D mode only (2, Interesting)

slackmaster2000 (820067) | more than 8 years ago | (#13884258)

Are you sure? TFA does say "Noted Finnish over-clockers Sampsa Kurri and Ville Suvanto achieved graphics engine clocks of 1.003 GHz and a memory speed of 1.881 GHz (940.50 MHz DDR (dual data-rate) memory clocks) with maximum system stability and no visual artifacts."

The phrase "maximum system stability" though might be misleading. If you define it as just POSTing, then man I've done some awesome overclocking myself! :)

Interesting that these overclockers are "noted", and "Finnish." That does sort of give them a little mystique, no? Anytime I hear about a Noted [country name] [scientist|engineer] I always think of an older guy in a white lab coat in some top secret super science facility working on amazing advances in science, like overclocking consumer video cards.

Wow, watch the paint dry too. (1)

NerdBuster (831349) | more than 8 years ago | (#13884099)

That's about how exciting this post is. What a dull day for news.

Cheating (1)

AC-x (735297) | more than 8 years ago | (#13884107)

Liquid nitrogen cooling is a bit of a cheat. Didn't Tom's Hardware get over 5 GHz out of a 3 GHz P4 a few years ago by constantly pouring liquid nitrogen over it?

I'm gonna wait until you can get 1 GHz with a practical cooling solution before getting too excited (though the way CPUs are heading these days, cryogenic cooling may come stock in a few years!)

Quake IV (1, Funny)

Anonymous Coward | more than 8 years ago | (#13884122)

And it still can't play Quake IV

Re:Quake IV (1)

YodaToad (164273) | more than 8 years ago | (#13884151)

Actually, if you want to make a jab at not being able to run something, you'd have better luck with F.E.A.R. Quake IV actually runs nicely on systems that couldn't even begin to run F.E.A.R.

This record already broken (3, Informative)

Ezku (806454) | more than 8 years ago | (#13884123)

Sampsa and Ville already broke their own record by overclocking the same setup to over 1 GHz for both the GPU and memory. See pictures over at Muropaketti [muropaketti.com].

Hot Hot Hot! (2, Funny)

tradjik (862898) | more than 8 years ago | (#13884130)

Now, just pair that up with Intel's new dual-core Xeons and watch your power meter spin! At least you won't have to worry about the increase in gas prices this winter; you can just run your PC to heat your home.

FPS (5, Funny)

koick (770435) | more than 8 years ago | (#13884131)

Cuz you know it's like way better to play Quake IV at 953 Frames Per Second. Totally!

Re:FPS (1)

dogmatixpsych (786818) | more than 8 years ago | (#13884304)

Yep, now we can play the crappy eye-candy games at twice the speed and end the pain twice as soon for only 4 times the cost.

New record! Doom 3 in 5 minutes.

That's sad... (5, Funny)

Xshare (762241) | more than 8 years ago | (#13884136)

That's just sad... that video card now has more clock speed and more memory than my own main computer.

Re:That's sad... (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#13884185)

You're right. Your machine is sad.

Re:That's sad... (1)

mcrbids (148650) | more than 8 years ago | (#13884398)

That's just sad... that video card now has more clock speed and more memory than my own main computer.

Curious thought - the human brain has about 50% of its mass devoted to processing sight and patterns in images... Sounds about right, doesn't it?

Needs comparatively less cooling... (0)

Anonymous Coward | more than 8 years ago | (#13884162)

...than a Pentium IV.

But I do love having a PIV now that winter is here.

Because it's worth zilch without... (4, Informative)

GrAfFiT (802657) | more than 8 years ago | (#13884259)

...the pictures of the rig: here they are [muropaketti.com], 3DMark05 included [muropaketti.com].

benchmarks not the right ones (0)

Anonymous Coward | more than 8 years ago | (#13884361)

The 3DMark you link to is not at 1 GHz, so those can't be the right benchmarks.

Overclocked Keith Curtis Breaks Your Face! (1)

Keith Curtis (923118) | more than 8 years ago | (#13884276)

Take THAT Irish!

O RLY? (3, Informative)

irc.goatse.cx troll (593289) | more than 8 years ago | (#13884317)

http://www.bfgtech.com/7800GTX_256_WC.html [bfgtech.com]

BFG GeForce(TM) 7800 GTX OC(TM) with Water Block. Factory overclocked to 490MHz / 1300MHz (vs. 400MHz / 1000MHz standard), this built-to-order card will feature a water block instead of a GPU fan for those wanting to purchase or who may already have an existing liquid-cooled PC system. BFG will hand-build your card using Arctic Silver 5 Premium Thermal Compound. Easily hooked up to any existing 1/4" tubing system or to 3/8" tubes with the included adapters, this card runs cool and silent. BFG Tech is proud to offer their true lifetime warranty on this graphics card. (Card with water block requires internal or external water cooled system, sold separately.)

you know when you need a new pc (0)

Anonymous Coward | more than 8 years ago | (#13884341)


when new video cards are faster than your CPU

It'll just about... (5, Funny)

David Horn (772985) | more than 8 years ago | (#13884385)

It'll just about be able to handle Windows Vista... :-)

Well (0)

Anonymous Coward | more than 8 years ago | (#13884387)

At least these guys are ready for Vista!!

Computer rice (0)

Anonymous Coward | more than 8 years ago | (#13884403)

Overclockers and their ilk remind me of ricers in the automotive world -- both use a lame "turn it up to 11" approach instead of real ingenuity and creativity in increasing performance. And whether the engine blows up from excessive boost and nitrous, or the CPU/GPU burns up from heat, they risk equipment damage by toying with technology they don't understand. What's even worse is that they take pride in doing it.

Congrats (2, Funny)

camzmac (889291) | more than 8 years ago | (#13884450)

Dear enthusiastic customer(s),

Congratulations, you have succeeded in clocking our cards past 1000 MHz. You are now in the ranks of Albert Einstein, Marie Curie, and Shakespeare. Many of our employees believe you are now cooler than Santa Claus.

Your friend,
ATI