
Blazing Fast Password Recovery With New ATI Cards

CmdrTaco posted more than 3 years ago | from the feel-safe-yet dept.

Encryption 215

An anonymous reader writes "ElcomSoft accelerates the recovery of Wi-Fi passwords and password-protected iPhone and iPod backups by using ATI video cards. The support of ATI Radeon 5000 series video accelerators allows ElcomSoft to perform password recovery up to 20 times faster compared to Intel top of the line quad-core CPUs, and up to two times faster compared to enterprise-level NVIDIA Tesla solutions. Benchmarks performed by ElcomSoft demonstrate that ATI Radeon HD5970 accelerated password recovery works up to 20 times faster than Core i7-960, Intel's current top of the line CPU unit."


215 comments

My password is safe (0, Informative)

Anonymous Coward | more than 3 years ago | (#31495760)

Because it's in my pants!

Re:My password is safe (5, Funny)

idontgno (624372) | more than 3 years ago | (#31496204)

Dude, haven't you heard? It's really insecure to use such a short password. And yours is surely the shortest EVAR.

My password. (-1)

Anonymous Coward | more than 3 years ago | (#31495868)

1 - 2 - 3 - 4 - 5

Re:My password. (3, Informative)

FireofEvil (1637185) | more than 3 years ago | (#31495942)

1, 2, 3, 4, 5? That's amazing! I've got the same combination on my luggage!

Re:My password. (2, Funny)

Jazz-Masta (240659) | more than 3 years ago | (#31496038)

Great! Now when I go into the bank with my stack of Radeon cards they'll call security.

Re:My password. (2, Funny)

BrokenHalo (565198) | more than 3 years ago | (#31497218)

Great! Now when I go into the bank with my stack of Radeon cards they'll call security.

No, you're only doing them a favour by "recovering" their passwords.

Stop with the advertising (4, Interesting)

ShadowRangerRIT (1301549) | more than 3 years ago | (#31495872)

This isn't really about GPUs, it's an advert for ElcomSoft products. The whole summary is in marketing-speak for crying out loud.

Re:Stop with the advertising (3, Informative)

ShadowRangerRIT (1301549) | more than 3 years ago | (#31495916)

And for the curious, TFA is no better. They're calling it a benchmark so they can advertise more effectively, that's all.

Re:Stop with the advertising (0)

Anonymous Coward | more than 3 years ago | (#31496376)

It didn't work though, as I'm an Intel Quad Core/nVidia user and apparently their software is poorly optimised for my system.

Hah. Eat that, bad marketing droids!

Re:Stop with the advertising (2, Insightful)

ClosedEyesSeeing (1278938) | more than 3 years ago | (#31496846)

... The whole summary is in marketing-speak for crying out loud.

And for the curious, TFA is no better. They're calling it a benchmark so they can advertise more effectively ...

You must be new here.

Re:Stop with the advertising (2, Interesting)

Sir_Sri (199544) | more than 3 years ago | (#31496674)

And a bit of an underhanded advert for ATI. 'Password recovery' is an inherently parallel problem that really likes the sort of math GPUs do, and not so much the sort CPUs do. The ATI 5000 series are the fastest GPUs available at retail right now; it doesn't take a genius to put 2 and 2 together here. Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips, so saying your new fancy retail part is twice as fast as the workstation version of the other guy's last-gen part is stating the obvious.

Re:Stop with the advertising (0)

Anonymous Coward | more than 3 years ago | (#31496792)

Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips, so saying your new fancy retail part is twice as fast as the workstation version of the other guy's last-gen part is stating the obvious.

I agree with what you're saying, but Nvidia's current GPUs are about 2 or 3 generations old. They did a die shrink, but it's the same as their previous-generation chip. They've been recycling chips with new part numbers while they fix the bump problems.

Re:Stop with the advertising (1)

toastar (573882) | more than 3 years ago | (#31496990)

Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips, so saying your new fancy retail part is twice as fast as the workstation version of the other guy's last-gen part is stating the obvious.

I agree with what you're saying, but Nvidia's current GPUs are about 2 or 3 generations old. They did a die shrink, but it's the same as their previous-generation chip. They've been recycling chips with new part numbers while they fix the bump problems.

Doesn't Fermi get released like next week?

Re:Stop with the advertising (1)

drachenstern (160456) | more than 3 years ago | (#31496916)

Well, I've not RTFA, but if they can get double the performance of a Tesla system using much cheaper video cards (as I recall a Tesla is expensive, which isn't saying much ~ I refuse to google if I won't RTFA), isn't that something to talk about?

BAH, now you've got me bothered to RTFA... guess I should go do work instead?

Re:Stop with the advertising (0)

Anonymous Coward | more than 3 years ago | (#31496934)

My Via Nano does this same thing about 27% faster than the fastest Intel part.

So there!

Portrayal (5, Insightful)

Dan East (318230) | more than 3 years ago | (#31495918)

I like the way this is portrayed in a totally positive light, as if a person, upon forgetting the password to their device, is going to go out and buy one of these video cards, install it in a machine capable of supporting it (PSU wattage, bus speed, OS, etc), purchase the proprietary "password breaker" software (sold by the company that authored this "story"), all just to recover their password. I think the typical usage for this type of setup is of a more nefarious sort.

Re:Portrayal (2, Interesting)

mcgrew (92797) | more than 3 years ago | (#31496036)

You remember that Elcomsoft was the company Dmitry Skylarof was (is?) with? He's the guy who got thrown in a US jail for something he did in Russia that was completely legal in Russia.

Re:Portrayal (1, Redundant)

jonbryce (703250) | more than 3 years ago | (#31496140)

No, the US jury found him not guilty.

Re:Portrayal (3, Informative)

russotto (537200) | more than 3 years ago | (#31496572)

No, the US jury found him not guilty.

No, the charges against Sklyarov were dropped and he was released as part of a deal in which Elcomsoft agreed to accept US jurisdiction. The US jury then found Elcomsoft not guilty.

Re:Portrayal (-1, Flamebait)

Lumpy (12016) | more than 3 years ago | (#31496770)

They did not send Navy SEALs into Russia to kidnap him and bring him here. He made the stupid move of coming into the USA. You don't enter a hostile country and not expect to get detained.

Re:Portrayal (1)

mcgrew (92797) | more than 3 years ago | (#31496598)

Yeah, after spending four months in jail. Lot of good it does you to be found not guilty when you're incarcerated anyway.

Re:Portrayal (1)

roman_mir (125474) | more than 3 years ago | (#31496288)

Dmitriy Skliarov is the more correct phonetic spelling. /. still does not accept UTF-8, it's retarded.

Re:Portrayal (0)

Anonymous Coward | more than 3 years ago | (#31496042)

but, we wouldn't be on your lawn.

Re:Portrayal (1)

bernywork (57298) | more than 3 years ago | (#31496072)

Yeah, like selling one-time-password solutions to IT bosses when someone gets ahold of their SAM...

Re:Portrayal (1, Troll)

wvmarle (1070040) | more than 3 years ago | (#31496152)

It all depends on your point of view.
One man's "password recovery" is another man's "password cracking".
Just like the same person being a "freedom fighter" and "terrorist/insurgent" at the same time.
It all depends on your point of view.

Re:Portrayal (2)

elrous0 (869638) | more than 3 years ago | (#31496632)

I'll take the point of view of the 99.999% of people who buy (or more likely pirate) this software, and say that its primary use will be nefarious.

Re:Portrayal (0, Troll)

Runaway1956 (1322357) | more than 3 years ago | (#31497014)

I don't know if that's entirely fair. I've cracked at least a half dozen passwords, for entirely benign reasons.

The ones I've done for fun, I can't even begin to count, LMAO

Nefarious? Well now - let's get ex-president Bill Clinton in this discussion, so we can obfuscate the meaning of "nefarious", along with "sex" and "is". ;^)

Re:Portrayal (1)

ElectricTurtle (1171201) | more than 3 years ago | (#31496246)

I tend to 'recover my password' for my wireless APs with my little friend the paper clip. You're absolutely right that the more common use of something like this is going to be cracking.

Re:Portrayal (1)

Anarki2004 (1652007) | more than 3 years ago | (#31496384)

The paper clip method is effective, but it's a real PITA if you have a bunch of ports forwarded and other obscure network settings. Though I suppose the paper clip is still cheaper than the system you need for the password cracking.

Re:Portrayal (1)

b4dc0d3r (1268512) | more than 3 years ago | (#31496584)

If you don't have "Save Settings" / "Load Settings" buttons somewhere in the interface, you should upgrade. I don't use them myself, but I know they are there.

Re:Portrayal (1)

ElectricTurtle (1171201) | more than 3 years ago | (#31496610)

If you don't have an AP/router that can export configs or you just don't export those configs, it's your own fault, just like not backing up your hard drive.

Re:Portrayal (1)

Minwee (522556) | more than 3 years ago | (#31496370)

a person [...] is going to go out and buy one of these video cards, install it in a machine capable of supporting it (PSU wattage, bus speed, OS, etc), purchase the proprietary "password breaker" software (sold by the company that authored this "story"), all just to recover their password. I think the typical usage for this type of setup is of a more nefarious sort.

I think you're right. Someone could use this kind of setup to play Crysis.

Re:Portrayal (1)

Yvanhoe (564877) | more than 3 years ago | (#31496794)

Yeah, there may be people who just moved in and who want to have *gasp* internet access! The vicious bastardly devils! The thieves of non-thievable property! The... the... TERRO-PIRATES!

Re:Portrayal (1)

Mister Whirly (964219) | more than 3 years ago | (#31497110)

Last time I checked, free unfettered internet access still isn't a god given right. If you want a service, you need to pay for it.

Re:Portrayal (1)

Sir_Lewk (967686) | more than 3 years ago | (#31496984)

I like the way crowbar makers advertise their product in a positive light. As if anyone, upon realizing that they need to pull hundreds of deeply sunken nails, is going to go out to the store and buy a heavy two-foot crowbar. I think the typical usage for crowbars like this is of a more nefarious sort.

GPUs (4, Interesting)

Thyamine (531612) | more than 3 years ago | (#31495946)

This isn't the first story about how crazy fast GPUs are for crunching. I know very little about that level of hardware, but why aren't we incorporating these types of things into CPUs? Is the coding/assembly so different that it doesn't translate? Do they only do certain kinds of processing really well (it is a GPU after all), so it couldn't handle other more 'mundane' OS needs?

Re:GPUs (2, Informative)

godrik (1287354) | more than 3 years ago | (#31496074)

It is in progress, in fact. That was the point of Intel's 80-core prototype.

I find it funny that with time we keep repeating the cycle: external processor -> co-processor -> integrated onto the CPU die -> external processor.

Re:GPUs (1)

poetmatt (793785) | more than 3 years ago | (#31496312)

you mean the one that horribly failed and wasn't even close to performing as well as a graphics card?

oh, right.

Re:GPUs (3, Interesting)

ShadowRangerRIT (1301549) | more than 3 years ago | (#31496442)

That's not really the same thing. The Intel 80 core prototype was still a CPU at heart, they just made improvements to communication. GPUs are quite different. GPUs are designed as primarily floating point processors (though newer ones can do low precision integer math with similar efficiency), but more importantly, they are vector processors with virtually no support for conditional statements and optimized for sequential access to memory instead of random access. They're halfway between dedicated circuitry and a general purpose CPU; what they can do, they do *very* well, and they can generalize a little, but tasks they weren't designed for need to be rewritten to accommodate their quirks, and eventually reach a point of diminishing returns. Integrating GPUs into the CPU will allow more programs to use it (and possibly speed processing and enable new scenarios where the CPU and GPU need to communicate frequently), but for run of the mill computing tasks, the relatively inflexible design of GPUs is a problem.

Re:GPUs (1)

godrik (1287354) | more than 3 years ago | (#31496658)

Well, it was not really a CPU at heart. It was more a complex network of heterogeneous computing units: classical CPUs, but also DSPs, vector float processing units...

It was of course a prototype and never reached the market, but merging CPU-type and GPU-type cores on the same chip definitely seems to be on Intel's and AMD's roadmaps.

Re:GPUs (3, Informative)

Anonymous Coward | more than 3 years ago | (#31496114)

GPUs are generally better at certain calculations, and are very good at parallel processing, since graphics work breaks down into parallel tasks very easily. For this, GPUs have a ton of cores. So in a way, CPUs are indeed following suit with multicore systems, but nowhere near the numbers GPUs use. High-end GPUs now have 480+ processor cores on a card; that's a lot more than a 4-core Intel ;). But if you had a ton of cores on the CPU, each additional one wouldn't add much actual CPU power, as most things must be done linearly, not in parallel; it really just helps with multitasking. Which is why a few cores are useful, but more power per core beats having a ton of them. Graphics cards go with a ton of lower-speed cores.

Re:GPUs (1)

ShadowRangerRIT (1301549) | more than 3 years ago | (#31496680)

...most things must be done linearly, not parallel.

Or to be a bit more precise: humans can't think about parallelism well. Certain obvious, discrete tasks can be split up, but having whole threads of execution constantly communicating and touching shared resources overwhelms the capacity of most programmers. You could write a massively multi-threaded program to do a lot of stuff that is currently done linearly, but you'd risk a whole lot of crashes and deadlocks from the inevitable bad code, and you wouldn't get the full increase in speed since thread and synchronization management would eat up a lot of the gain. Because of that, most successful programs that use multiple threads of execution completely separate the running environments (heavily segregated threads or completely different processes). The ones that don't need that level of separation (video encoding, password cracking, etc.) are often the same programs that can benefit from running on a GPU.
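A rough Python sketch of that segregated style (the hashing workload and the slicing here are invented for illustration; real crackers do the same thing at vastly larger scale): each worker owns a disjoint slice of candidates and shares nothing, so there is no synchronization to get wrong.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def count_hits(prefixes, suffixes, target_prefix):
    """Scan one slice of the keyspace; touches no shared state."""
    hits = 0
    for p in prefixes:
        for s in suffixes:
            if hashlib.md5((p + s).encode()).hexdigest().startswith(target_prefix):
                hits += 1
    return hits

suffixes = [format(i, "04d") for i in range(1000)]
slices = [["a"], ["b"], ["c"], ["d"]]          # one disjoint slice per worker

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(count_hits, slices,
                         [suffixes] * 4, ["0"] * 4))

# No locks, no communication mid-flight; the result matches a plain
# sequential scan exactly.
assert total == count_hits(["a", "b", "c", "d"], suffixes, "0")
```

Because the slices never interact, the split-then-sum result is identical to the sequential scan; that is the property GPUs exploit.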

Re:GPUs (2, Interesting)

imgod2u (812837) | more than 3 years ago | (#31496122)

To some level, CPUs have been moving to be more GPU-like for a long time. SIMD (SSE, AltiVec, NEON) are GPU features that made their way to CPUs. Ditto for parallel, long pipelines. Remember the Pentium 4? That was a huge step in the GPU direction.

There are two problems with that approach:

1. Code that isn't pure number-crunching doesn't run well on such a compute model.
2. The model is almost entirely memory-starved. GPUs have up to a GB of high-speed, dedicated RAM on the card itself. CPUs have to live with high-density (relatively slow) slot-loaded memory.

AMD is moving in a direction where the GPU compute parts are fed by the CPU front-end. As we move forward, I suspect we'll see more of a "fusion", if you will (don't sue me), of the two compute models.

Re:GPUs (1)

poetmatt (793785) | more than 3 years ago | (#31496356)

As an aside, GPUs have up to 2GB [slashgear.com] and will soon have 4GB. The rest of what you said is dead on.

You are right, though: the GPU/CPU hybrid seems to be a possible end result if the combination can be made to work. I suspect there is a lot of very tough engineering involved in getting such a concept working.

Re:GPUs (1)

imgod2u (812837) | more than 3 years ago | (#31496660)

The biggest problem will be heat. GPUs currently consume and dissipate upwards of 200W. Likewise for CPUs. Getting a single die or even package to consume and dissipate that much power and heat will be a challenge not just for the silicon designers but for the system guys as well.

Re:GPUs (1)

TheKidWho (705796) | more than 3 years ago | (#31496130)

Yes, GPUs are very different: they are designed to apply a lot of very similar calculations to an extremely large set of vector data. That's also pretty much all they do; they aren't nearly as good at logic as a traditional CPU is.

Re:GPUs (1)

jonbryce (703250) | more than 3 years ago | (#31496176)

The coding / assembly is so different that it doesn't translate, and they only do certain kinds of processing well.

Re:GPUs (0)

Anonymous Coward | more than 3 years ago | (#31496186)

YES.

These cards are fast in certain applications due to the extreme parallelism, this simply doesn't work in the majority of day-to-day computing.

Re:GPUs (1, Informative)

Anonymous Coward | more than 3 years ago | (#31496210)

GPUs are ridiculously parallel SIMD style processors. They are good at performing massive amounts of calculations in parallel, but for it to be effective these calculations have to be the same across all cores. GPUs don't have a huge amount of true CPU-style cores; rather, they can run one or a few algorithms over many instances of data in parallel. This works great for certain scientific and brute-force calculations such as these (and for 3D games), but it doesn't really work for regular programs. Also, GPU programs usually need to be written in a specific programming language (usually a derivative of C) and with this parallelism in mind.

CPUs already have something like this (SIMD instructions), and they help for many workloads, but massive parallelism like this only really works for GPU-type tasks, not your average OS/apps.
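A toy sketch of that "same algorithm over many instances of data" model, in plain Python rather than a real GPU language: per-element branches are replaced by computing both outcomes for every element and selecting with a mask, which is roughly how GPUs handle conditionals (predication).

```python
data = list(range(8))

# Scalar (CPU-style): one element at a time, a real branch per element.
scalar = [x * 2 + 1 if x % 2 else x * 2 for x in data]

# Batched (GPU-style): the SAME operations applied uniformly to every
# element; the "if" becomes a select between two fully computed results.
doubled = [x * 2 for x in data]
plus_one = [x + 1 for x in doubled]
is_odd = [x % 2 == 1 for x in data]
batched = [p if m else d for d, p, m in zip(doubled, plus_one, is_odd)]

assert batched == scalar    # identical results, branch-free data flow
```

The batched version does strictly more arithmetic, but every element takes the same instruction path, which is what lets hundreds of GPU lanes run in lockstep.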

Re:GPUs (0)

Anonymous Coward | more than 3 years ago | (#31496810)

Also, even if you tried, you couldn't do something like this on a regular x86 CPU. Most of the non-cache circuitry on an x86 CPU is dedicated to decoding the pathetically bad x86 instruction set and working around its huge limitations in order to attain reasonable performance (i.e. that circuitry does nothing useful). The only way to get decent performance with a reasonable amount of transistors (so you can have many cores) is a properly thought-out architecture (e.g. PowerPC), and to go even smaller / more parallel you usually design your own ISA (e.g. the Cell's SPEs or GPU shader processors).

Re:GPUs (0)

Anonymous Coward | more than 3 years ago | (#31497062)

Ah. Spoken like a true unemployed former PowerPC programmer who is bitter because most of the world is using the x86 instruction set.

Re:GPUs (3, Informative)

John Napkintosh (140126) | more than 3 years ago | (#31496214)

The last sentence nails it. They only do certain types of operations well, and the frequency with which I upgrade GPUs compared to CPUs - or more specifically, the fact that I very rarely replace both at the same time - leads me to believe I'm better off having them separate. Maybe there are parts of the GPU which could be incorporated into the CPU, and I think that might be what the Core i3/5/7 processors are doing with GMA integration.

Re:GPUs (1)

wvmarle (1070040) | more than 3 years ago | (#31496222)

They tend to be specialised processors, designed specifically for graphics-related tasks. Those tasks happen to be computationally very similar to other tasks such as protein folding. They will, though, be poor performers at, or possibly totally incapable of, certain tasks your CPU has to do.

That said, I'm waiting for the first CPU to build in a GPU so we don't even need a separate graphics chip on our motherboards any more for the already-integrated graphics output.

Re:GPUs (1)

TheKidWho (705796) | more than 3 years ago | (#31496496)

What are you waiting for? The new Intel processors already have integrated on-die GPUs. The next generation will have the GPU and the CPU completely integrated.

Re:GPUs (1)

wvmarle (1070040) | more than 3 years ago | (#31496802)

You see how well I follow current hardware :)

When buying a computer these days I go for the cheapest/slowest specced hardware which is way more than what I need (watch videos, troll /., e-mail, general browsing, some web/general programming, standard office work).

And actually what I'm waiting for to buy a new box is for the old one to die. 5 year old hardware is still fast enough for pretty much everything that I do.

The CPU speed problem is solved and done with for all but the most demanding applications (since well 8-10 years now).

GPU as well. A little later, when did we get integrated graphics again?

Memory problem is also solved. So cheap now.

The iPhone is already as powerful as a 10-year-old desktop. And it's not that I'm doing anything now that I couldn't do on my computer 10 years ago - at least not hardware wise.

OK so now the GPU is on the CPU die. The memory controller is there already, right? It should be a no-brainer to integrate small stuff like ethernet. Bluetooth/wifi may be a bit harder due to the necessary aerial. Now all that's left is to integrate the RAM on the die and we're there. A one-chip computer. No need for complex motherboards any more. At that moment the whole hardware issue is solved.

Re:GPUs (5, Informative)

SuperMog2002 (702837) | more than 3 years ago | (#31496372)

Is the coding/assembly so different that it doesn't translate? Do they only do certain kinds of processing really well (it is a GPU after all), so it couldn't handle other more 'mundane' OS needs?

Yes, exactly. CPUs are built from the ground up to do scalar math really, really fast. That lends itself well to tasks that must be performed in sequence, such as running an individual thread. However, they've only recently gained the ability to do more than one thing at a time (dual-core processors), and even now high-end CPUs can only do six calculations at once (six-core processors).

Meanwhile, GPUs are built to do vector math really, really fast. They can't do individual adds anywhere near as fast as a CPU can, but they can do dozens of them at the same time.

Which type of processor is best for which job depends entirely on the nature of the math involved and how parallelizable the task is. In the case of 3D graphics, drawing a frame involves tons of vector arithmetic work, which is why your 1 GHz GPU will run circles around your 3 GHz CPU for that task (and is also where the GPU gets its name from). In the case mentioned in the article, password cracking is highly parallelizable: you've gotta run 100 million tests, and the outcome of any one test has zero influence on the other tests, so the more you can run at the same time, the better. By running it on the GPU, each individual test will take a bit longer than running it on the CPU would, but you'll be able to run dozens simultaneously instead of just a few, and will thus get your results much faster.

CPUs certainly have their place, though. Some tasks simply must be done in sequence and cannot be easily divided up into separate parallel tasks. The CPU will get these done much faster, since running them on the GPU would incur the speed penalty without realizing any benefit.

I've simplified it a bit for the sake of explanation, but that's the gist of it. Hope that helps!
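A toy illustration of that independence (a sketch only; the hash, charset, and target below are invented for the example, and real tools attack the actual key-derivation functions, not plain MD5): every candidate can be tested with no reference to any other candidate, which is exactly why the work splits cleanly across hundreds of GPU cores.

```python
import hashlib
from itertools import product

def brute_force(target_hex, charset, length):
    """Exhaustively test every candidate of the given length.

    Each test is independent of every other test, which is why the
    search splits cleanly across CPU cores or GPU shader units.
    """
    for combo in product(charset, repeat=length):
        candidate = "".join(combo)
        if hashlib.md5(candidate.encode()).hexdigest() == target_hex:
            return candidate
    return None

# Hypothetical target: the MD5 of "gpu", searched over lowercase letters.
target = hashlib.md5(b"gpu").hexdigest()
print(brute_force(target, "abcdefghijklmnopqrstuvwxyz", 3))  # prints "gpu"
```

To parallelize, you would simply hand each worker a different range of `product`'s output; no worker ever needs another worker's results.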

Re:GPUs (1)

0123456 (636235) | more than 3 years ago | (#31496698)

I know very little about that level of hardware, but why aren't we incorporating these types of things into CPUs?

Because most people don't want their CPU consuming 300W of power when idle?

Re:GPUs (1)

blackchiney (556583) | more than 3 years ago | (#31496888)

It's all about IP. It wouldn't be horribly difficult to put a GPU and CPU on the same die. BUT, Intel doesn't want GPU manufacturers getting into the x86 business, and GPU makers certainly aren't going to give Intel any of their technology and get cut out of the market. Intel's attempts at GPUs have been less than spectacular: good enough for Word and Excel, not good for modern gaming.

So for the foreseeable future CPUs and GPUs will be treated as separate entities.

Re:GPUs (1)

Orgasmatron (8103) | more than 3 years ago | (#31496932)

3D rendering involves lots of integer math, and there are huge portions of any given render that do not depend on each other. For example, the scene may involve calculating the vectors from thousands of vertices and faces of polygons towards hundreds of light sources. That is millions of operations that are essentially independent. Another phase of a render will require calculating the intersection of each view vector (and more if you use FSAA) with a polygon in the scene.

So, modern GPUs are a special case of a CPU. They have many cores, each of which has many integer and/or vector units. Their sole purpose in life is to perform those millions of parallel operations as fast as possible. Modern GPUs can perform hundreds or thousands of operations per cycle, at speeds gradually approaching CPU speeds.

Technically, you can run arbitrary code on a GPU, just as your keyboard or SCSI controller is Turing complete, but they are so optimized for their job that it would be impractical. Unless, of course, you have a massively parallel integer math problem you want to solve, like brute-forcing a password...

Re:GPUs (0)

Anonymous Coward | more than 3 years ago | (#31496992)

Cell processors, anyone?

I like the use of the word "Recovery" (1)

wisebabo (638845) | more than 3 years ago | (#31495948)

I think we all know what they really mean. ;)

(Anyway, I'm also impressed by the power shown by the GPUs. It's a good demonstration that some of the new technologies (CULA? CUDA?) that let "regular" programmers tap this power really will speed some things up.)

Slashvertisement (5, Funny)

Anonymous Coward | more than 3 years ago | (#31496002)

Hey Editors,

You forgot a link to the buying page [elcomsoft.com]
For as low as 1.399,- € you can start cracking^Wrecovering passwords today.

Re:Slashvertisement (4, Informative)

cOldhandle (1555485) | more than 3 years ago | (#31496132)

In case anyone wants to play around with this tech without paying (or rolling your own): I tried out this free (as in beer) Windows software yesterday: http://golubev.com/rargpu.htm [golubev.com] It seemed to work very effectively: I was able to brute-force 5-character, lowercase-only passwords on RAR files in a couple of minutes on a GTX260. It also has some advanced options to specify mutations of strings to try, and to use word lists.
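The arithmetic behind that "couple of minutes" checks out; here is the keyspace math (the guess rate below is a round-number assumption for illustration, not a measured GTX260 figure).

```python
charset_size = 26            # lowercase letters only
length = 5
keyspace = charset_size ** length
assert keyspace == 11_881_376

rate = 100_000               # RAR guesses/second, assumed for illustration
worst_case_min = keyspace / rate / 60
print(f"{worst_case_min:.1f} minutes worst case")   # ~2.0 minutes
```

Adding one more character multiplies the keyspace by 26, which is why length matters far more than exotic symbols at these rates.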

Re:Slashvertisement (2, Informative)

elrous0 (869638) | more than 3 years ago | (#31496676)

Agreed, looks more like the kind of "story" we'd see posted by kdawson, not Taco.

Out of curiosity... (2, Interesting)

Anonymous Coward | more than 3 years ago | (#31496086)

I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?

What does your screen look like while a program like this is running?

Re:Out of curiosity... (3, Informative)

cbope (130292) | more than 3 years ago | (#31496174)

Normal. Running GP-GPU or CUDA apps has no effect on output to the screen. We do it for medical image processing.

Re:Out of curiosity... (1)

Lunix Nutcase (1092239) | more than 3 years ago | (#31496212)

I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?

Yes many people do it and have for years.

What does your screen look like while a program like this is running?

Why do you assume that the screen looks different?

Re:Out of curiosity... (2, Funny)

Anonymous Coward | more than 3 years ago | (#31496298)

Good point. Why would I assume a graphics card operation would have any effect on graphics? I've only ever used mine to take ice off the windshield.

Re:Out of curiosity... (1)

Lunix Nutcase (1092239) | more than 3 years ago | (#31496978)

Except for the fact that the part doing these calculations has nothing to do with the parts of the GPU that are handling outputting to the screen?

Re:Out of curiosity... (1, Informative)

Anonymous Coward | more than 3 years ago | (#31496318)

He probably assumes the screen looks different because he assumes video cards are nothing but raw memory-mapped video framebuffers -- which hasn't been the case since 1990 or so.

Re:Out of curiosity... (1)

Verunks (1000826) | more than 3 years ago | (#31496460)

Why do you assume that the screen looks different?

Because when you run a CPU-intensive application your PC becomes really slow, so if you use your GPU instead, the screen should become "slower" too. But probably you wouldn't even notice.

Re:Out of curiosity... (2, Informative)

Anonymous Coward | more than 3 years ago | (#31496756)

The display buffer for a 1920x1200 screen with 24-bit colour takes less than 7MB. Even a fairly low-end graphics card will have at least 128MB of memory. In other words, there's plenty of memory for a program running on a GPU without needing to piss on the display buffer.

If your screen is just displaying a bunch of 2D windows, then the 100s of cores in your GPU will be sitting idle. Again, computations running on the GPU will have no impact on what you see.
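That framebuffer estimate is easy to verify (24-bit colour is 3 bytes per pixel):

```python
width, height, bytes_per_pixel = 1920, 1200, 3   # 24-bit colour
framebuffer_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(f"{framebuffer_mb:.2f} MB")                # 6.59 MB, comfortably under 7
```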

Re:Out of curiosity... (2, Informative)

ShadowRangerRIT (1301549) | more than 3 years ago | (#31496830)

I run the Folding@home [stanford.edu] GPU client [stanford.edu] on my GeForce 8800 GTX. On Vista and later OSes (pre-Vista, the driver model wasn't well adapted to GPGPU, which led to a really inefficient polling-driven communication scheme), the effect on resources is unnoticeable except during games (where I kill the client to reduce jerkiness); the GPGPU work is lower priority and gets shunted aside by rendering, though the latency involved is a problem for graphics-intensive games. For less demanding work and general usage, it's unnoticeable; the GPU is perfectly capable of drawing the screen and curing Alzheimer's at the same time. :-)

boo (5, Informative)

Anonymous Coward | more than 3 years ago | (#31496102)

boo slashvertisement

103000 passwords per second. So? (2, Informative)

roman_mir (125474) | more than 3 years ago | (#31496192)

That one ATI board gets 103K passwords per second versus only 4K on the latest quad-core Intel (which, by the way, is almost 26 times faster, not 20.)

So that's wonderful. How many keys are there in 1024-bit SSL encryption? A 1024-bit asymmetric key is roughly equivalent to an 80-bit symmetric key, so that's like 2^80 possibilities, right?

Let's say 100,000 passwords per second, that's 10^5.

Google says this: (2^80 / 10^5) / (3600 * 24 * 365) = 383 347 863 000

383.3 billion years to go through every password in 2^80 possibilities.

In reality, of course, not every combination is tried: many passwords can be eliminated by heuristics, and it helps to have a good dictionary file handy from which to generate the most likely candidates first. That probably cuts the figure down to something much more ATI-friendly. Of course, we'd need to use a stronger cipher.
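For what it's worth, the back-of-the-envelope math above can be reproduced in a few lines of Python (assuming the round figure of 100,000 guesses per second):

```python
# Exhaustive-search time for an 80-bit keyspace at the quoted GPU rate.
keyspace = 2 ** 80                 # ~1.2e24 candidates
rate = 100_000                     # guesses per second (rounded)
seconds_per_year = 3600 * 24 * 365
years = keyspace / rate / seconds_per_year
print(f"{years:.3e} years")        # on the order of 4e11: hundreds of billions
```

That's tens of times the current age of the universe, which is why brute force only becomes interesting once dictionaries and heuristics shrink the search space.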

As a final note: at last I understand why Hugh Jackman needed the 7 monitor setup, each one must have been used as an output device for the video card it was connected to. Obviously the video cards were the actual power behind all that hacking!

Re:103000 passwords per second. So? (1)

Gothmolly (148874) | more than 3 years ago | (#31496536)

FWIW, I work for $LARGE_US_BANK and we have an 8-character password limit; it MUST be exactly 8 characters, contain a number, etc., etc., etc.

MANY passwords can be eliminated through some social engineering.

Re:103000 passwords per second. So? (1)

L4t3r4lu5 (1216702) | more than 3 years ago | (#31497074)

Well done, sunshine! You've just reduced the number of attempts needed to break $LARGE_US_BANK passwords to 1,370,114,370,683,136 (78^8).

At 103,000 attempts per second, that's... 421 years. Oh.

(Yes I know it's not going to take until the entire set has been attempted to crack a password.)
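The arithmetic behind that figure, spelled out (the 78-symbol alphabet is the comment's assumption about the allowed character set):

```python
# Keyspace for exactly-8-character passwords over a 78-symbol alphabet.
alphabet_size = 78
length = 8
keyspace = alphabet_size ** length
print(keyspace)  # 1370114370683136

rate = 103_000   # guesses per second, from the GPU benchmark
seconds_per_year = 3600 * 24 * 365
years = keyspace / rate / seconds_per_year
print(f"{years:.1f} years")  # a bit under 422
```

And, as the parenthetical notes, on average you'd expect to hit the right password after searching about half the keyspace, so roughly 210 years.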

Re:103000 passwords per second. So? (1)

maxume (22995) | more than 3 years ago | (#31496656)

So all you have to do to save 190 billion years is buy two of them.

Excellent.

Re:103000 passwords per second. So? (1)

roman_mir (125474) | more than 3 years ago | (#31496710)

No, definitely, you can buy 100,000,000 of them.

Then it's only about 3,800 years. With a bulk discount they'd only cost maybe $10 per card, so that's just $1,000,000,000, a billion? Chump change for any government. Spend 1,000 times more money and you're down to under four years.
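The scaling here is just linear division: N cards divide the search time by N but multiply the cost by N (the $10-per-card bulk price is the comment's hypothetical):

```python
# Cost vs. search time when brute-forcing 2^80 with a farm of GPUs.
rate_per_card = 100_000           # guesses per second per card (rounded)
cards = 100_000_000
price_per_card = 10               # hypothetical bulk-discount price, USD
seconds_per_year = 3600 * 24 * 365

cost = cards * price_per_card
years = 2 ** 80 / (rate_per_card * cards) / seconds_per_year
print(f"${cost:,} buys a search that still takes ~{years:,.0f} years")
```

Which is the crux of the next reply: each additional card saves less absolute time than the one before it.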

Re:103000 passwords per second. So? (1)

maxume (22995) | more than 3 years ago | (#31496828)

None of the additional units save as much time as the second one.

Hey Guys! (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#31496202)

I can see my penis! I can see my penis! Guess this diet is working.

Unfortunately it's smaller than a two year old's, but it's still larger than kdawson's.

Password Recover is the new Hacking? (1)

olddotter (638430) | more than 3 years ago | (#31496278)

What is with the spin talk here in the title? Basically this just says I need to use better passwords. Speak truths....

High end video cards now tax deductible? (1)

perpenso (1613749) | more than 3 years ago | (#31496400)

So a programmable high end video card can probably be written off on one's taxes as a numerical accelerator? :-)

Finally! (1)

pyrr (1170465) | more than 3 years ago | (#31496436)

Finally...someone who understands!

I wanted to get one of those professional car door jimmy kits (the ones with a jimmy for just about every model of vehicle!) that tow truck supply vendors sell "just in case I get locked out of my car", but they had these outrageous demands that I "prove" that I was a legit tow outfit or garage.

The locksmith supply was much the same way when I tried to buy a lockpick set, "just in case I get locked-out of my house".

You can bet I'll be getting this software. I must've forgotten my password because I can't login to my secure wireless network, and I unfortunately don't have physical access to my WAP in order to reset it the cheap and easy way, with a pen tip.

ObRokicki (1)

michaelmalak (91262) | more than 3 years ago | (#31496574)

From 1986 [aminet.net]:

Executes the cellular automata game of LIFE in the blitter chip. Uses a 318 by 188 display and runs at 19.8 generations per second. Author: Tomas Rokicki

Password Recovery (1)

MrTripps (1306469) | more than 3 years ago | (#31496864)

"Password recovery" is about the same swarthy euphemism as "waste management" or "escort service." Why did an advertisement for hacking passwords get on /.? Aren't their IRC channels for that sort of thing?