
Grid Computing Saves Cancer Researchers Decades

samzenpus posted more than 6 years ago | from the begin-the-beowulf-cluster-comments dept.

Supercomputing 149

Stony Stevenson writes "Canadian researchers have promised to squeeze "decades" of cancer research into just two years by harnessing the power of a global PC grid. The scientists are the first from Canada to use IBM's World Community Grid, a network of PCs and laptops with power equivalent to one of the globe's top five fastest supercomputers. The team will use the grid to analyze the results of experiments on proteins, using data collected by scientists at the Hauptman-Woodward Medical Research Institute in Buffalo, New York. The researchers estimate that this analysis would take conventional computer systems 162 years to complete."


149 comments

Oh great ... (5, Funny)

trolltalk.com (1108067) | more than 6 years ago | (#21276103)

Wanna bet they discover that maple syrup or Canadian back bacon cures cancer?

Re:Oh great ... (-1, Troll)

Nimey (114278) | more than 6 years ago | (#21276567)

No, the latest discovery is that your mom's vaginal secretions cure cancer.

Re:Oh great ... (1)

butterwise (862336) | more than 6 years ago | (#21276621)

No, but they may find a link between causes of death and weight [slashdot.org] .

Re:Oh great ... (1)

Silver Gryphon (928672) | more than 6 years ago | (#21278455)

Mmmmm... poutine...

I used to run Folding@... (5, Insightful)

kcbanner (929309) | more than 6 years ago | (#21276109)

...as a competition with friends. But then I realized that I didn't really need to use my computers as heaters...and did a number for the planet and closed the client.

Re:I used to run Folding@... (4, Informative)

Turn-X Alphonse (789240) | more than 6 years ago | (#21276133)

If you run it at a low level you can increase your usage by only about 1-2% and still help the project. There is no logical reason to run the client at 100% if it's going to cost you a bomb, whereas at 1-2% you won't win any contests, but you will be helping the project and paying at most a buck or two extra on electricity a month.

Re:I used to run Folding@... (1)

UncleTogie (1004853) | more than 6 years ago | (#21276307)

Personally, around our 3 PCs in a smoke-laden environment, I've only seen a {mobo-measured} temp increase of at most 4-5 degrees C {and usually only 2-3 degrees, on systems ranging from a PIII with XP Pro to an Athlon XP 2000+ dual-booting Ubuntu/XP Pro...}

BOINC seems to run a wee bit hotter on Ubuntu, but I've not benchmarked the two clients yet. I'm just guessing that more efficient code allows for more ops per cycle, meaning more CPU use and thermal waste, but that's all it is: a guess. Anyone else have any insight/numbers on this?

Re:I used to run Folding@... (0)

Anonymous Coward | more than 6 years ago | (#21277399)

It probably has more to do with the increased power consumption involved, and the fact that the variable power generating plants tend to be ones that produce CO2, like coal-fired thermal plants.

Re:I used to run Folding@... (1, Interesting)

stratjakt (596332) | more than 6 years ago | (#21276197)

I'm not yet one of the climate change true believers.

It just feels too much like an economic scheme based in pseudo-science and half truths. "The world is doomed, here's how you as a consumer can spend your way to salvation! Buy a new car and light bulbs filled with mercury!"

Its poster child, Al Gore, uses the word "if" too much. It's an old debating trick, to say "if X, then Y", and focus on the terrible consequence Y, and completely avoid the debate - which is over the validity/scope/level/definition of X.

At the end of the day, and back on topic, I know cancer is absolutely real, and I also know that real cures and treatments are buried in the mountains upon mountains of data in hospitals, schools and research centers.

So, I'm maybe gonna pick a horse I think has a chance of crossing the finish line.

Re:I used to run Folding@... (2, Insightful)

ChatHuant (801522) | more than 6 years ago | (#21276349)

... Al Gore, uses the word "if" too much. It's an old debating trick, to say "if X, then Y", and focus on the terrible consequence Y, and completely avoid the debate - which is over the validity/scope/level/definition of X

I don't see it as a trick, but rather as being honest. Many of the "X" items aren't certain; it would be a lie to present them as such. But we can estimate the probability of X (based on the current state of knowledge), and explore the consequences if X *does* occur. Gore's argument is that the consequences are serious enough to require action now, even if X may not happen after all. Most climate change skeptics I've seen ignore that and focus on the fact that the Xs aren't 100% surely proven.

Re:I used to run Folding@... (2, Insightful)

Belial6 (794905) | more than 6 years ago | (#21276581)

Gore is an idiot. The real "Inconvenient Truth" is that following Gore's advice will kill you within minutes. I'm not convinced that mass suicide is really the right answer.

Re:I used to run Folding@... (1)

modecx (130548) | more than 6 years ago | (#21277517)

Gore is an idiot. The real "Inconvenient Truth" is that following Gore's advice will kill you within minutes. I'm not convinced that mass suicide is really the right answer.

Yeah, but if we did that, we'd really fuck the CO2 levels of the Earth, you know, about 2 days to a week later, anyway.

Re:I used to run Folding@... (1)

Petersson (636253) | more than 6 years ago | (#21278891)


In fact, you can even reduce your carbon emissions to zero.

Al Gore


For a start, Al Gore should reduce his own personal CO2 emissions to zero, e.g. stop breathing.

Re:I used to run Folding@... (1, Interesting)

stratjakt (596332) | more than 6 years ago | (#21276605)

What's ridiculous about the debate is the supposed "corrective actions" are a step backwards if you really analyze them.

Don't buy a Prius. It may get better mileage - though if you convert to gallons per mile, a true measure of energy cost, it doesn't look so good. Never mind the fact it runs on laptop batteries, which makes it a disposable vehicle at the end of the day.

Or go to a "super efficient" diesel engine. Well, there are reasons we restrict the number of diesels that can be put on the road, and those huge plumes of black smoke that puff out are full of nasty shit. Remember acid rain? You may not be old enough - it was the source of our ecological doom in the 80s, and it is a byproduct of sulphur dioxide. Despite the industry's claims of super-low-sulphur diesels, the amount in the fuel is not insignificant.

The fact is, modern engines are quite well designed, and with regular maintenance could easily last your entire life. So what does the auto industry do now? Appeal to vanity, sure - you don't want to be seen in the same old rig, do you? They've now found a way to appeal to your innate sense of guilt. Buy a Prius, save this baby seal from clubbing!

I also see the government mandating switches to compact fluorescents as being absurd. The quality of light is terrible, they don't handle slight power fluctuations all that well, even the ones that are "dimmable" really aren't, and they don't work in the cold. We don't want all that mercury in our groundwater when those things start hitting the dumpster en masse. Besides, switching would just mean people leave the lights on longer. If you're used to paying 150 bucks a month, you'll keep paying it - just a weird sociological quirk, in the vein of "some snack has half the calories, so fatty eats 3, and thinks he's dieting".

All this over CO2, a benign non-toxic gas which makes a small contribution to "the greenhouse effect", although it's been shown that the real major cause is water vapor in the atmosphere - but we have no way to track that.

All the while we search for imaginary mystery energy sources - we're almost there, we just need a huge breakthrough in physics/chemistry/biology/astrology - and we completely ignore the fact that nuclear fission can supply our energy needs, with no need to dig up and burn dinosaurs.

But what of the radioactive waste? Well holy fuck, we can stand around masturbating waiting for a magical breakthrough bacteria that turns garbage into gold-plated hydrogen for our fuel-cell car of tomorrow, but this great global scientific community can't figure out how to throw out some gunk? How about this: I'll figure it out for them. Pulverize it and release it into the atmosphere - it'd be less radioactive material than comes out of a typical coal plant in a year.

My point is, there are sensible, practical answers - but they aren't all futuristic and neat, they don't involve funky lightbulbs and throwaway cars, and they would (gasp) preserve the American lifestyle that it's become so vogue for the left to hate, despite living the exact same way.

All I see are ivory tower assholes using this current round of paranoia to line their pockets. I distrust any solution that involves me "buying new things".

So, lets cure cancer.

Didn't we call Acid Rain a myth? (0, Offtopic)

killmofasta (460565) | more than 6 years ago | (#21278949)

Check it out: Google search for Acid Rain MYTH:

http://www.fortfreedom.org/n15.htm [fortfreedom.org]

"THE CONTINUING MYTHOLOGY ABOUT ACID RAIN (8/31/1989)"

If acid rain was called a myth in 1989, and it's an accepted fact now (shuddup, you "no holocaust" guys), it took 28 years to be accepted as true on general principle.

It took at least 5 years for the existence of HIV/AIDS to impact the screening of blood.

What do you think the acceptance rate for global warming will be?

How long before we ban the flying of airplanes in the stratosphere?

Your call

Re:I used to run Folding@... (1)

bhima (46039) | more than 6 years ago | (#21279179)

I own a modern turbo diesel and it does not emit "huge plumes of black smoke that puff out are full of nasty shit". Thus I know that your statement is false.

This calls into question the rest of your statements which are probably false and certainly histrionic.

Re:I used to run Folding@... (0, Insightful)

Anonymous Coward | more than 6 years ago | (#21276725)

Just curious, what kind of evidence would you like to see that you would call "good enough?"

Science, by its nature, can't be definitive 'til the actual event occurs. Lots of "planes" never left the earth or crashed on takeoff before the Write brothers hit the scene. We were only 100% certain we could make anything fly when we saw a plane actually flying.

So let's say you're trying to see if you're on a course that will cause something bad to happen. You want to know because you'd like to change course to avoid it if it's going to happen. You WILL NOT be 100% certain you are on that negative course until (unless) you see the bad thing happen. Then it will be too late to avoid it. In fact, if you do successfully avoid the bad thing, there's a really good chance you won't ever be able to tell if it was because of your actions, some other action, or just bad initial predictions.

You still want to make a very good educated guess, because the bad thing is very bad. Knowing that you will NEVER have "proof" that the bad thing will occur, what level of evidence will you accept before committing to difficult, expensive actions to avoid the bad thing?

To tell the truth, I'm not particularly interested in whether you believe human-caused global warming is in effect. What I'm interested in is what sceptics are looking for in regard to evidence. When someone says the evidence isn't good enough, that implies that some kind of evidence would be. Can you tell me what that would look like?

Re:I used to run Folding@... (1)

ppanon (16583) | more than 6 years ago | (#21277499)

Yep. It almost makes you wish that nobody had done anything about 2-digit dates so that January 1, 2000 could have been a serious problem. That way you wouldn't have revisionist people denigrating the efforts put in to avoid the Y2K problem as a waste of resources. I think a lot more people would be willing to appreciate the potential risks of Global Warming if, among other glitches, their company's payroll systems had made it hard for them to get a paycheck in the first few months of 2000.

Yeah, the "Write brothers"... insightful. Sure. (0)

Anonymous Coward | more than 6 years ago | (#21278071)

The burden of proof of a claim is on the person making the claim.

Parent post was not insightful and was modded up for political reasons. Sad.

Re:Yeah, the "Write brothers"... insightful. Sure. (1)

Stooshie (993666) | more than 6 years ago | (#21279797)

Did you read or understand your parent post at all?

Re:I used to run Folding@... (1)

EsbenMoseHansen (731150) | more than 6 years ago | (#21279449)

I'm not yet one of the climate change true believers.
[...] Buy a new car and light bulbs filled with mercury!"

To be fair, the saved electricity means less mercury emission from coal plants, so the mercury, at least, is of little concern.

Or, you could look into LED.

Re:I used to run Folding@... (1)

statemachine (840641) | more than 6 years ago | (#21276269)

If you run a server that needs to be available 24/7 but still has idle time, f@h will have a minimal footprint and a lot of potential benefit.

Re:I used to run Folding@... (1)

jhol13 (1087781) | more than 6 years ago | (#21277887)

Can I run it so that speedstep/cool'n'quiet still works? What I mean is, I do not want to run anything which increases the CPU frequency. Instead it should keep the CPU at the lowest frequency. Can this be accomplished?

Re:I used to run Folding@... (3, Informative)

TeknoHog (164938) | more than 6 years ago | (#21279611)

Can I run it so that speedstep/cool'n'quiet still works? What I mean is, I do not want to run anything which increases the CPU frequency. Instead it should keep the CPU at the lowest frequency. Can this be accomplished?

Linux's CPU frequency scaler has this option. For example the 'conservative' governor has the file /sys/devices/system/cpu/cpu0/cpufreq/conservative/ignore_nice_load. So a program running with lower than default priority will not increase CPU frequency.

I use a script [iki.fi] to handle CPU frequency changes. When I'm at home with my laptop, I use the "ignore nice" option which in practice will turn the fan off. YMMV. When I go somewhere, I can set the CPU to full steam.
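A minimal sketch of that setup (not the linked script), assuming a Linux machine that exposes the sysfs paths quoted above - they vary by kernel version and governor - and root privileges:

#!/usr/bin/env python3
# Sketch (assumes root and the sysfs layout quoted above): switch all CPUs
# to the 'conservative' governor and enable ignore_nice_load, so a nice'd
# donated-computing client will not push the clock frequency up.
import glob

def set_governor(gov="conservative"):
    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"):
        with open(path, "w") as f:
            f.write(gov)

def ignore_nice_load(enable=True):
    # Path as given in the parent comment; newer kernels may expose this
    # under /sys/devices/system/cpu/cpufreq/conservative/ instead.
    path = "/sys/devices/system/cpu/cpu0/cpufreq/conservative/ignore_nice_load"
    with open(path, "w") as f:
        f.write("1" if enable else "0")

if __name__ == "__main__":
    set_governor()
    ignore_nice_load(True)

With the governor configured this way, running a folding/BOINC client under nice (e.g. nice -n 19) keeps the CPU at its lowest frequency, as described above.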

Re:I used to run Folding@... (1)

mrbluze (1034940) | more than 6 years ago | (#21276305)

...and did a number for the planet and closed the client.

Definitely a better idea to use the internet for communication and to use electricity for things that benefit the household/office directly. I wouldn't be surprised if the cost in reduced years of life from the increased pollution caused by running these distributed tasks outweighs the years of life extended by treating cancers.

Re:I used to run Folding@... (2, Insightful)

scottv67 (731709) | more than 6 years ago | (#21277639)

...outweighs the years of life extended by treating cancers.

It's easy to feel that way until someone in your family is diagnosed with cancer. Also, treating cancer does not just "extend life". There are a lot of younger people (20 to 40 years old) who get different forms of cancer. For them, it's not "will I live to 76 or will I live to 80?" but "will I live to see 30?". Don't even get me started on the kids who are afflicted with these diseases.

Re:I used to run Folding@... (1)

mrbluze (1034940) | more than 6 years ago | (#21278707)

I do take your point, but I guess my argument was kind of a bit different - not that there should be no research, but that all the number crunching and so forth does consume a fair bit of power [connectedinternet.co.uk] . Computers and associated components already account for 3% of energy expenditure [wsj.com] and something like two thirds of office computers are never switched off. I guess you could say "use the switched on ones for folding @ home", but why not switch them off instead?

Why shouldn't drug companies provide the computing power instead? They make the biggest investment in research but also get the economic benefits.

I do empathize with anyone who has a chronic / terminal illness, the sacrifices they have to make and how it can devastate families, and with the fact that, knowing there are no true cures, people who have had cancer always have a cloud over their heads. But denying them a cure was never my point.

Re:I used to run Folding@... (1)

CajunArson (465943) | more than 6 years ago | (#21277659)

You sir are a candidate for the Fox 5 at 10 school of massively misjudging actual risks. Here's a hint: If you thought that pollution from using your computer was going to be SO great that it would dwarf the benefits of curing CANCER (a disease that was killing people a long time before we had global warming hysteria) then you should probably never: 1. use a fucking computer; 2. never destroy the "environment" by READING OR POSTING TO SLASHDOT!!!

Re:I used to run Folding@... (1)

Shikaku (1129753) | more than 6 years ago | (#21276317)

But then I realized that I didn't really need to use my computers as heaters...and did a number for the planet and closed the client.
Then just run it in the winter.

Re:I used to run Folding@... (2, Insightful)

DigiShaman (671371) | more than 6 years ago | (#21276423)

Then just run it in the winter.


Exactly!

Rather than turning on my heater these past few days (it's getting chilly at night in Houston, TX), I run the GPU Folding@home client on my PC. Seriously, it's not wasted energy if you want your home to be heated. You also participate in a worthy cause to boot!

Re:I used to run Folding@... (3, Insightful)

ZorinLynx (31751) | more than 6 years ago | (#21276997)

This only applies if you use electric heating.

In most places, electrical energy costs a HELL of a lot more per watt-hour than other sources like natural gas, oil, propane, and so on.

So unless you heat your home with electricity, which practically no one north of Florida does unless they have VERY cheap electrical power, you'll still be paying more by running computers.

Re:I used to run Folding@... (1)

scottv67 (731709) | more than 6 years ago | (#21277531)

So unless you heat your home with electricity, which practically no one north of Florida does unless they have VERY cheap electrical power, you'll still be paying more by running computers.

I don't think anyone would disagree with you. The point that the parent post was trying to make is that a nice side benefit of running a distributed computing client like F@H is that the heat from your computers will help heat your home. Would anyone suggest running a bunch of quad-cores at 100% as a replacement for natural gas? No. But since you have donated the cost of the electricity to run those machines to the distributed computing project, the heat generated by the PCs is free.

Yes, my gas furnace runs less often when my Folding machines are going full-throttle.
http://kakaostats.com/usum.php?u=583666 [kakaostats.com]

Re:I used to run Folding@... (1)

Sentry21 (8183) | more than 6 years ago | (#21277999)

In university, I moved in with a roommate into a 'rear suite' (the street number was 669 1/2) which had recently been renovated, but which had also spent a great deal of time uninhabited. As a result, the utilities had been shut off, since no one was using them. 'Utilities' in this case, however, refers only to electricity, since in this area (Fredericton, NB), any heat source other than electricity and oil (which would be hauled to your home in a tank truck) was unthinkable. Natural gas was 'too new' and 'dangerous', and how could it be trusted, even though the rest of the world has been using it for decades?

So the place is heated by electricity, and we move in literally 20 minutes after seeing the place (my roommate was on his way into town with a moving truck while I was apartment hunting) so we don't have time to get the electricity turned on. Furthermore, it will take a day or two for NB Hydro to get a guy out there to do the job. So, now we have a poorly-insulated (despite renovations) apartment with no heat, no electricity, no lights, etc. It's basically a box with doors at this point.

Except - what's this! - the electrical sockets on the front wall of the unit, the wall that we shared with the house itself, were apparently on the house's breakers. Curiously enough, this was where we had decided to plug our computers in already. Well, problem solved.

After arranging furniture, setting up two tables, plugging in our switch, computers, monitors, etc., and loading up Serious Sam, we found ourselves in a much more comfortable situation until we could get the heat turned back on.

Moral of the story: never underestimate the capability of AMD, ATI, and Samsung to make your December more comfortable.

Re:I used to run Folding@... (1)

steevc (54110) | more than 6 years ago | (#21278799)

This gets me thinking that maybe I should use a temperature sensor that makes Folding/dnet/whatever run whenever it gets cold enough to need the heating.

It's probably not a good idea to run these on servers if you are having to cool the server room already.

I do have a bit of a conflict between my geeky side wanting to run this stuff and my green side that wants to cut my energy usage. As others have pointed out, a good compromise is to have an app that can use just a limited percentage of the available CPU, like Folding can on Windows. I'm running dnet on my dual-core laptop, but only letting it use one core to keep the heat down.

Re:I used to run Folding@... (2, Interesting)

porpnorber (851345) | more than 6 years ago | (#21278579)

Meanwhile, since I live in Canada and by this time of year I do need heating, I have my boinc client running at 100%, I'm doing some good, and (since the peak capacity of the machine is justified in other ways) it's not costing a penny. The heating here is electric anyway; it may as well do some computation on its way into my home!

Doing whatever@home in the winter is just good sense.

Now what's needed is a distributed computing client that is controlled by a room thermostat. No, really, I'm totally serious.
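A toy sketch of that idea, assuming BOINC's standard command-line tool (boinccmd) and its --set_run_mode switch are available; read_room_temperature() below is a purely hypothetical stand-in for whatever sensor you actually have:

#!/usr/bin/env python3
# Thermostat-controlled number crunching: run BOINC when the room is cold,
# pause it when the room is warm enough. Assumes boinccmd is on the PATH;
# the sensor read below is a hypothetical stub.
import subprocess
import time

TARGET_C = 20.0    # desired room temperature in Celsius
HYSTERESIS = 1.0   # degrees of slack so the mode doesn't flap

def read_room_temperature():
    # Hypothetical placeholder: replace with your real sensor
    # (1-wire probe, USB thermometer, ACPI thermal zone, ...).
    with open("/tmp/room_temp_c") as f:
        return float(f.read())

def set_boinc_mode(mode):
    # mode is one of "always", "auto", "never"
    subprocess.run(["boinccmd", "--set_run_mode", mode], check=True)

if __name__ == "__main__":
    while True:
        t = read_room_temperature()
        if t < TARGET_C - HYSTERESIS:
            set_boinc_mode("always")   # cold: compute, and heat the room
        elif t > TARGET_C + HYSTERESIS:
            set_boinc_mode("never")    # warm enough: stop crunching
        time.sleep(300)                # check every five minutes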

Yeah, but... (1)

sh3l1 (981741) | more than 6 years ago | (#21276161)

Yeah, but they still have to gather all the research and organize it; the computer will just be much faster than the human operators. Oh, and when this thing finally discovers that it doesn't need humans, I would like to personally say that I humbly accept our new robot overlord.

Re:Yeah, but... (0)

Anonymous Coward | more than 6 years ago | (#21276179)

Gaia would like to thank you for your peaceful surrender. I / we did not want to have to use mentalics on anyone.

Me next! (1)

Arancaytar (966377) | more than 6 years ago | (#21276219)

Ahem: Imagine a Beowulf----

Oh wait.

Re:Me next! (1)

ppanon (16583) | more than 6 years ago | (#21277603)

Ahem: Imagine a Beowulf----
cluster of animated Angelina Jolie-lizards?

Re:Yeah, but... (1)

Cassius Corodes (1084513) | more than 6 years ago | (#21276235)

That's ok - they just wire together a whole bunch of researchers to make that go faster.

I'm late for a NAMBLA meeting (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#21276173)

Wednesday is my busy day. After I get off work as a substitute gym teacher, I have to race across town to the NAMBLA meeting. [nambla1.de] Then it's a quick pit stop at McDonald's for a bite and to check out who is having a Happy Meal in Playland. Afterward I must zip uptown to the Apple Users' Group meeting. Finally by 11:00 pm it's time to head home with my PowerBook and scope out the K12 chat rooms. As I said, it's my busy day. If it weren't for my Apple computer, I don't know how I could do it all.

The Answer is... (0, Offtopic)

deep_creek (1001191) | more than 6 years ago | (#21276203)

The Answer is 42.

Re:The Answer is... (0)

Anonymous Coward | more than 6 years ago | (#21277093)

in base 13

Storm Botnet (5, Funny)

creativeHavoc (1052138) | more than 6 years ago | (#21276233)

If they wanted to knock that 10 years down to 5 they could just buy a chunk of the Storm worm botnet!

Re:Storm Botnet (1)

Varun Soundararajan (744929) | more than 6 years ago | (#21276553)

I am seriously wondering why they didn't think of the PS3s. 700,000 PS3s recently subscribed to a network that ended up with petaflops of peak performance. If I were managing this stuff, I would seriously take a look in that direction. Cell processors are designed for such distributable tasks and they are very good at them.

Re:Storm Botnet (1)

scottv67 (731709) | more than 6 years ago | (#21277577)

I am seriously wondering why they didn't think of the PS3s. 700,000 PS3s recently subscribed to a network that ended up with petaflops of peak performance.

I think you are thinking of the Folding@home project at Stanford:
http://folding.stanford.edu/English/FAQ-PS3 [stanford.edu]

How good are the programs (4, Insightful)

gringer (252588) | more than 6 years ago | (#21276275)

I hope they're using programs that've had a few computer scientists' eyes over them. One of the issues I see with supercomputing is that people tend to see it as a way to get around dumb code(1) — if the computer's fast enough, you can implement *five* infinite loops, have an exponential time algorithm, and still get the calculations done before dinner!

(1) although from their point of view, it's just slow code.

Re:How good are the programs (1)

ppanon (16583) | more than 6 years ago | (#21277553)

Heh. Since a lot of the calculations are floating point, I think you're at least as likely to have numerical analysis errors that make the data coming out of that loop be dominated by precision errors. But I think in a lot of cases they do use optimized libraries (e.g. LINPACK) that do most of the math properly and limit the options for really dumb code.

harnessing the power.... (1)

hb253 (764272) | more than 6 years ago | (#21276331)

If only they could somehow harness the power of steam!

When will the power of grid computing... (-1, Troll)

Anonymous Coward | more than 6 years ago | (#21276337)

...be turned towards eliminating the Nigger Problem?

162 years? (4, Insightful)

sayfawa (1099071) | more than 6 years ago | (#21276341)

Okay, not that I'm knocking how cool this grid computing is, but that estimate of 162 years without grid computing couldn't possibly be taking into account the acceleration of computing power. Maybe with today's computers it would take 162 years, but after the first couple of years just get a new computer and cut the time in half.

Which reminds me of how towards the end of my grad school career I did hours long simulations that would have taken weeks at the beginning of grad school. I was in grad school a long time :(

Re:162 years? (1)

RealGrouchy (943109) | more than 6 years ago | (#21276465)

The same could be said for life expectancy: right now the average North American life expectancy is around 70-something. I wouldn't be surprised if, when I'm in my late 60s, life expectancy has increased to 80-something or even 90.

- RG>

Re:162 years? (1)

ppanon (16583) | more than 6 years ago | (#21277621)

Given the increase in obesity across the population, I expect average North American life expectancy to decrease. However, for the subgroup that can maintain a healthy diet and a good exercise balance, I think average life expectancy will go up to the range you are talking about.

If you want to live longer with a good quality of life, eat a healthy balanced diet, make sure you don't let your body fat percentage get too high, and find a low-impact aerobic exercise that you enjoy and can continue to do as you get older. It will keep your heart healthy and your mind active. Swimming, walking, hiking, dancing, that sort of thing.

Re:162 years? (1)

sayfawa (1099071) | more than 6 years ago | (#21279621)

Cool, I was having the same thought recently. Depressed about how long it's taken me to finish school and really start life, it occurred to me that an average life span of 70 only applies to people born 70 years ago, when medical technology was crap compared to today. Which allows me to put off saving for retirement (or acting like an adult) without feeling bad.

Re:162 years? (4, Insightful)

JK_the_Slacker (1175625) | more than 6 years ago | (#21276575)

We're computer scientists. We can calculate these kinds of things. Protein folding calculations take a ridiculous amount of time and processing power. That's a reflection of how complex your DNA is, not a reflection of how much processing power we have at our disposal. If we could borrow from the computing power of the future, then you might be right. But the fact remains, we only have what's at our disposal now. At the current state of computing technology, the calculations would take 162 years.

That's the thing, though... as computing power scales, so does the distributed computing. With one centralized server, if you start running a simulation on it, you have to continue to run that simulation on that server. On the other hand, in a distributed environment, when newer, more powerful machines come out, you can just set up a simulation client on it, and increase your calculation speed by that much. I used to run Folding @ Home on a 700 MHz computer with 256 MB of RAM. I later upgraded to a 1600 MHz computer with 512 MB of RAM. Now, I fold on a 2.2 GHz dual-core machine with 2.5 Gig of RAM. Does the newer machine do the work much faster than the two older machines? Yes, it does. Does that mean that the work I did on those older machines was needless? No. I still fold occasionally on the 1.6 GHz machine, and it takes about a week to turn over a WU, as opposed to less than 24 hours on my main machine. Should I stop folding on the old one because the new one works so much faster? No, because that's about 52 WUs I don't have to fold on my main machine per year. It's an increase in computing power, and that's always desirable in a situation like this.

It's all fine and dandy to talk about how much computing speed will increase in the future... but, in the end, reality overcomes theory. There are people dying of cancer right now, people that can be helped by letting computers do the work. True, in two years, the work will likely get done faster... but, that doesn't change the fact that we can't just sit around and wait. When those better computers come in to play, then let's add them to the pool. Until then, let's get something done.

Re:162 years? (1)

sayfawa (1099071) | more than 6 years ago | (#21279601)

My comment wasn't some kind of proposal or solution and was in no way saying that this grid computing isn't a great thing. I was merely making the observation that it's dumb to consider only today's computing power and then come to the conclusion that a calculation will take 162 years, regardless of what the calculation is about. It will obviously take a much shorter time than that since the computers crunching the numbers will occasionally be upgraded.

But it's not important, it was just a one-liner from TFA and was probably just meant for the lay people in the popular press to show them what an achievement this grid computing is. But it's still a dumb statement.

Re:162 years? (1)

bartok (111886) | more than 6 years ago | (#21277001)

Wow, only 81 years

Re:162 years? (1)

allenw (33234) | more than 6 years ago | (#21277645)

But there isn't just raw computing power in play here. There are also the I/O requirements, memory requirements, etc. That's the beauty of grid computing - by distributing the load you can increase the throughput of the entire system, not just an individual component.

Better title: saves time (1)

pimpimpim (811140) | more than 6 years ago | (#21279931)

There is a theory among grad students doing computational simulations that they might as well do nothing for the first two years, and then perform all the calculations in the last few years, without losing time. Also, this is 162 years for a single core; in reality, problems like this will be run on a parallel machine.

That said, just as 'cancer research' is a way to get easy funding, 'grid computing' is not much more than that. The theory is very nice: work on a machine anywhere in the world from your own desktop without having to worry about which machine it is. In the end, you'll actually have to recompile your program for each different architecture. If you do science, your program will change a lot, and you'll have to do the recompiling a lot.

Done before (1)

gudnbluts (1186023) | more than 6 years ago | (#21276401)

Not exactly new, is it? I was running a distributed cancer protein matching app six or seven years ago. Oxford University did it.

Re:Done before (1)

scottv67 (731709) | more than 6 years ago | (#21277677)

Not exactly new, is it? I was running a distributed cancer protein matching app six or seven years ago. Oxford University did it.

The Folding@home project at Stanford has been around that long as well.
http://folding.stanford.edu/ [stanford.edu]

162 years (1)

ConcreteJungle (1177207) | more than 6 years ago | (#21276413)

whatever happened to Moore's law?

Re:162 years (2, Insightful)

JK_the_Slacker (1175625) | more than 6 years ago | (#21276635)

Moore's law doesn't help us right NOW. If I promise you ten bazillion dollars in 2025, that doesn't help you buy even a stick of gum today.

Unless, of course, you'd like to stick to the realm of theoretics, in which case I postulate that cancer doesn't exist and neither do you, and by a solid application of Finagle's law I'm about to take a hatchet to my left hand. Do you see my point?

Re:162 years (1)

Jessta (666101) | more than 6 years ago | (#21277347)

More transistors != more performance.

This is great and all but... (4, Interesting)

Icarus1919 (802533) | more than 6 years ago | (#21276521)

But do we see a chunk of the profit that they'll be making off the cancer drugs they make from this data that OUR computers analyzed and then is eventually sold to us for too-high-to-afford prices?

Re:This is great and all but... (1)

SquallStrife (669316) | more than 6 years ago | (#21276699)

Only in America... where according to the health cover firms, it's evil for the government to look after the health of its people.

Re:This is great and all but... (1, Insightful)

Anonymous Coward | more than 6 years ago | (#21276709)

But do we see a chunk of the profit that they'll be making off the cancer drugs they make from this data that OUR computers analyzed and then is eventually sold to us for too-high-to-afford prices?
no, but if you ever get any of these diseases you will now have the privilege of being alive.

Re:This is great and all but... (2, Insightful)

FooAtWFU (699187) | more than 6 years ago | (#21276717)

So, you seem to be complaining that the (evil) biopharmaceutical companies are greedy and want money and this is wrong... unless you can have a slice of it too? I think you need some sort of levee around your moral high ground, buddy.

Re:This is great and all but... (1)

SquallStrife (669316) | more than 6 years ago | (#21276747)

I'd say he's more upset at the fact that the biopharmaceutical companies are allowed to hold your health to ransom, even when they utilise YOUR resources to further that stranglehold.

Re:This is great and all but... (3, Informative)

S.O.B. (136083) | more than 6 years ago | (#21277035)

But do we see a chunk of the profit that they'll be making off the cancer drugs they make from this data that OUR computers analyzed and then is eventually sold to us for too-high-to-afford prices?


The research is being done by scientists at Princess Margaret Hospital in Toronto, a government run hospital. If you knew anything about health care in Ontario you'd know that profit is the last thing on their mind.

Re:This is great and all but... (1)

MishgoDog (909105) | more than 6 years ago | (#21277503)

You are one hundred percent correct. We should NOT be contributing our precious *cough*unused*cough* CPU cycles to evil, money grubbing governmental institutions purely so they can further get better profits. No cure for a disease which causes 13% of all deaths [who.int] is worth that, not unless I see some money for using my precious CPU cycles!

Re:This is great and all but... (1)

CaptainNerdCave (982411) | more than 6 years ago | (#21279077)

If you're really concerned about private companies making money off of your PC... why not join up with SETI? I think that is probably going to be the biggest and potentially most controversial discovery in human history up until then, possibly ever.

I don't know about you, but I'm proud to say that I'm part of a top-ten junior college team that I set up with a few friends. (Except that SETI@home seems to have changed the junior college teams list to include all teams... :shrug:)

Desktops are not supercomputers (3, Informative)

deadline (14171) | more than 6 years ago | (#21276633)

Every time these "connect desktops to become the fastest computer in the world" articles come up, I have to dust off my Cluster Urban Legends [clustermonkey.net] article to clear up the misconceptions that abound. I also did a piece [linux-mag.com] on the Linux Magazine site that debunks much of the spam-bot supercomputer legend (you need to register for that one).

Re:Desktops are not supercomputers (1)

nonsequitor (893813) | more than 6 years ago | (#21276815)

The computers participating in the grid project are not just "desktop" computers. The ones connected from my alma mater were the ones that were maintaining thousands of X-Sessions across campus, on all the library machines and in all of the labs in dozens of buildings, supporting a student population of 40,000 students. Not the same as getting the spare cycles from someone's entertainment system or personal computer.

Re:Desktops are not supercomputers (3, Insightful)

deadline (14171) | more than 6 years ago | (#21276983)

I'm not talking about spare cycles. I'm talking about the naive notion that gets repeated in the press: "the combined power of all these computers equals one of the fastest supercomputers in the world". For trivially parallel applications this might be true, but just once I would like to see these "supercomputers" run a simple parallel benchmark like High Performance Linpack (used for the Top500 list). My guess is the number of real FLOPS would be much less than expected -- if it even finished. Don't get me wrong, using computers like this is a great idea; it is not one of the most powerful computers in the world, however.

Re:Desktops are not supercomputers (1)

jafiwam (310805) | more than 6 years ago | (#21277229)

Seriously, if you can break up a task into small chunks and process it faster than some computer can, WTF difference does it make if it fits your definition of some benchmark or other. Did the data get processed? (_) Yes (_) No Who cares if YOU define a supercomputer a certain anal way and decide it isn't fastest under XYZ criterion.

You and Tom from Tom's Hardware should get together and chew the fat about your benchmarks.

Nobody else gives a shit if the data set is done.

Re:Desktops are not supercomputers (1)

mgblst (80109) | more than 6 years ago | (#21279851)

It is an important distinction if you actually have to work on a cluster or a supercomputer. There are some things that supercomputers can do that clusters can't.

If your application needs to communicate with other instances of itself a lot, then you don't want a cluster.

If your program doesn't, then a cluster is fine.

Re:Desktops are not supercomputers (1)

scottv67 (731709) | more than 6 years ago | (#21277427)

I'm not talking about spare cycles. I'm talking about the naive notion that gets repeated in the press "the combined power of all these computers equals one of the fastest supercomputers in the world"

If you know so much about the topic, why aren't you at Stanford telling Dr. Pande and his group that they are wasting their time with all those desktops and PS3's? I'm sure Dr. Pande would love for you to point out how his research would be much better off if he'd just go buy some time on a supercomputer.

http://folding.stanford.edu/ [stanford.edu]

Re:Desktops are not supercomputers (1)

dabadab (126782) | more than 6 years ago | (#21279541)

I would like to see these "supercomputers" run a simple parallel benchmark


But the thing is, these clusters are not made for running benchmarks, but for real (and specialized) calculations. My home server processes data for the World Community Grid and I see that the client silently numbercrunches for a few hours and then communicates for a few seconds (at the amazing speed of about 50 kB/s). And for this actual usage, the grid shows a performance that could only be replicated by a powerful supercomputer.

PS3 Supercomputer (3, Informative)

jhines (82154) | more than 6 years ago | (#21276645)

Folding@home has reached a petaflop with the help of PS3s. A record, supposedly, according to BBC News. http://news.bbc.co.uk/2/hi/technology/7074547.stm [bbc.co.uk]

I run their PC software on the systems I keep on. They are getting results, and publishing papers based on the research.

Patents? (2, Interesting)

DoofusOfDeath (636671) | more than 6 years ago | (#21276669)

I'm very glad to help cancer research, but will this also result in the development of drug patents that (a) bankrupt some patients, and (b) prevent other researchers from improving on those drugs?

Because that would make me feel a little less charitable with my computing power. (Only a little, though.)

Re:Patents? (1)

piojo (995934) | more than 6 years ago | (#21276827)

I'm very glad to help cancer research, but will this also result in the development of drug patents that (a) bankrupt some patients, and (b) prevent other researchers from improving on those drugs?
I agree with you, in principle (that it's just not fair for you to gain nothing), but isn't donating your CPU time still the best solution? I mean, it's not as though there's some choice you could make that would likely lead to a better outcome for you.

Re:Patents? (1)

tsotha (720379) | more than 6 years ago | (#21277391)

Which is worse than not having the drugs at all? Patents expire, and the sooner the drug is developed, the sooner the patent will expire.

Re:Patents? (1)

scottv67 (731709) | more than 6 years ago | (#21277477)

I'm very glad to help cancer research, but will this also result in the development of drug patents that (a) bankrupt some patients

The alternative is "don't help the distributed computing project" and those drugs will never be 'discovered'. Then, instead of being poor and alive, the patients will be wealthy and at room temperature.

I don't get it... (3, Informative)

Pedrito (94783) | more than 6 years ago | (#21277153)

"The researchers estimate that this analysis would take conventional computer systems 162 years to complete."
They're always saying, "We've knocked decades off of our work by using the right tool for the job." That's like me saying I knocked decades off of the calculations to run an energy minimization on a hexane molecule by running it on my Core 2 Duo instead of my Atari 800.

I mean, let's face it. They weren't going to let the friggin' program run for 162 years. The problem became solvable when the hardware became available. Hell, within 5 years, that "conventional computer system" will be able to solve it in a fraction of that 162 years, and 5 years later, a fraction of that. So what do you do? You wait until the hardware meets up with ability to solve the problem. They haven't saved decades. They probably haven't even saved a decade. Within a decade they'd probably be able to run it in a few days on a conventional computer.

Re:I don't get it... (1)

scottv67 (731709) | more than 6 years ago | (#21277743)

You wait until the hardware meets up with ability to solve the problem.

So, if I'm following you correctly, you want the medical researchers to stockpile all the research projects that have "heavy computing demands" until Intel comes out with their 128-core CPU? What do we do in the meantime? Just sit around and say "Oh jeez, sorry we don't have a treatment for your leukemia. But in ten years, we are going to launch a computer program that will have an answer for us after running for just thirty days!"?

Re:I don't get it... (1)

Pedrito (94783) | more than 6 years ago | (#21277939)

So, if I'm following you correctly, you want the medical researchers to stockpile all the research projects that have "heavy computing demands" until Intel comes out with their 128-core CPU?

No, you're not following me correctly. My point is, nobody is going to run a program that's going to take decades to run. Instead, they're going to run some scaled-down version that approximates a solution, or they're going to find some other method to solve the problem, until the computing power is available to run it in a reasonable period of time. Since computer speeds increase exponentially, the math to calculate the best time to run the software is pretty straightforward. If you say processing speeds double every 2 years (not exact, but not too far off) and you have a program that will take 160 years to run, then if you wait 4 years, it'll only take 40 years for it to run. Another 4 years and it will only take 10 years to run it. So after only 8 years of waiting, you're looking at about 18 years total vs. the original 160 years. So the point is, it sometimes pays to wait. You could start running the program 8 years earlier and get a head start, but with the doubling of speeds every 2 years, that only accounts for a small fraction of the total computation. That's my point.
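The arithmetic behind that is easy to check; a quick sketch using the same simplified model as above (a 160-year job at today's speed, speed doubling every two years, one machine, no mid-run upgrades):

# Back-of-the-envelope check of the wait-vs-start-now argument above.
JOB_YEARS_TODAY = 160.0
DOUBLING_PERIOD = 2.0

def total_years(wait_years):
    # Runtime shrinks by 2x for every doubling period you wait.
    runtime = JOB_YEARS_TODAY / 2 ** (wait_years / DOUBLING_PERIOD)
    return wait_years + runtime

for w in (0, 4, 8, 12, 16):
    print(f"wait {w:2d} years -> done after {total_years(w):6.1f} years total")
# wait  0 years -> done after  160.0 years total
# wait  4 years -> done after   44.0 years total
# wait  8 years -> done after   18.0 years total
# wait 12 years -> done after   14.5 years total
# wait 16 years -> done after   16.6 years total

Under this model the ~18-years-total figure quoted above falls out directly, and waiting too long eventually starts costing time again.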

Re:I don't get it... (1)

Aladrin (926209) | more than 6 years ago | (#21279755)

Yes, or they could do it RIGHT NOW and save 17 years. (Actually, the sweet spot is 12 years away, since it would then take 2.5 years to run for a total of 14.5 years, and 14 would still take 1.25 years for a total of 15.25 years. So they'd save 13.5 years if they could run it in 1 year on today's computers.) While that's not -decades- it IS over a decade. Do you know how many people die of cancer in a decade?

http://www.medicalnewstoday.com/articles/37480.php [medicalnewstoday.com] Apparently about 550,000 people die of cancer each year in the USA alone! That's 5.5 million Americans that could be saved if the cure for cancer comes 10 years earlier.

I think it's pretty hard to argue that they should just wait and do the calculation later, and 'approximate' calculations aren't very good for this kind of research.

Open Source Software Cures Cancer (3, Insightful)

atwtftg (660575) | more than 6 years ago | (#21277733)

According to the World Community Grid website:

World Community Grid [worldcommunitygrid.org] is making [this] technology available only to public and not-for-profit organizations to use in humanitarian research that might otherwise not be completed due to the high cost of the computer infrastructure required in the absence of a public grid. As part of our commitment to advancing human welfare, all results will be in the public domain and made public to the global research community.

WCG uses the Berkeley Open Infrastructure for Network Computing (BOINC) client, an open source software project that runs on Linux, Mac and Windows. Headline should read Open Source Software Cures Cancer ;-)

BoincStats [boincstats.com] shows you who is contributing to World Community Grid projects. Check it out...and ask yourself why you aren't contributing.

How could this be? (2, Funny)

WK2 (1072560) | more than 6 years ago | (#21277767)

How could they knock decades of research off when we are less than 10 years (TM) away from a cure?

I can see it now (3, Interesting)

EEPROMS (889169) | more than 6 years ago | (#21277915)

We "the people" run the software and pay the millions of dollars of hardware and electricity costs. When the problem is solved the University patents everything (thank you suckers) and licenses the technology for for a small fortune to some back stabbing Megacorp (TM) drug company. So when "we the people" get sick we have the wonderful knowledge that we have paid twice for the ripp-off drugs. So all things being fair, if you want my cpu spare time I want a part of the license fees to pay for the drugs that cost a house when I get sick.

SETI (1)

SlashDev (627697) | more than 6 years ago | (#21278195)

I did belong to the SETI@home program at one time. I wonder if an update could instantly switch all its clients over to this cancer research grid instead.

Re:SETI (1)

eneville (745111) | more than 6 years ago | (#21279569)

Yeah, it probably could, but the client program would require some reprogramming. At present it processes work units looking for various patterns that might indicate some presence out there that is sending a signal. The other distributed programs work in different ways and process work packets differently. Take a look at the folding@home project. The bigger issue for the Canadians is probably bandwidth costs, although that said, SETI probably has a bigger cost since it's DSP data.

I OBJECT!! (5, Interesting)

Anonymous Coward | more than 6 years ago | (#21278209)

I know this research, and the people involved in it very, very well, and I think this project is a very sad, very large waste of computing time.

Let me back up and explain what the project is doing. To simplify a little bit, the vast majority of "work" in the cell is done by proteins. While DNA can be thought of as something like a simple "string", proteins have complex three-dimensional shapes. Knowing those 3D shapes is of great interest to biologists. There are several reasons for that. One is that it can allow easier design of drugs targeted at a specific part of the protein. Another is that by seeing the shape, we can understand how all the mutations that occur in disease might be affecting its function.

The primary way to determine the shape of the protein is to take the protein and to grow it into an ordered crystal. You can then shine an x-ray beam through the crystal, and the diffraction pattern that emerges can be, through some very complex math, reverse-engineered into a 3D structure. Typically the most difficult part of this process is finding the specific chemical conditions that will allow a crystal to grow. These conditions differ from protein to protein.

This project is not "solving cancer", by any means. Rather, the people in Buffalo have generated a high-throughput way of screening different chemical conditions to determine which ones might allow a protein crystal to grow. They use robotics to screen about 1000 conditions, and take pictures of each condition. The question then becomes: can you automatically process the pictures to find crystals? That's the goal of this project, to help automatically identify crystals in this screen.

So why do I object so strongly to this work? There are three reasons.

First, the project has nothing to do with cancer. In fact, the proteins being analyzed are not in any way "cancer-specific proteins" -- many of them are not even human!! This "cancer" pitch is a sales job, and nothing but a sales job. As a cancer researcher, it offends me that people try to use the disease to justify research that is this unrelated.

Second, the project is ill-conceived, technically. In no way did the group in question (Igor Jurisica's lab, in Toronto) carefully select a machine-learning approach to identify good ways of analyzing images. Instead, they have just selected something like 1000 different techniques, and are running *all* of them on every image they have. It's a fishing expedition, with the hope that one of those thousand metrics they return will be a useful predictor.

Third, the techniques selected are basically arbitrary. Most egregiously, there appear to be NO Fourier transforms included in the analysis!! Further, the images generated by the software appear to be transforms of something called "gray level cooccurrence matrices", and the computation of those can be estimated in no more than five minutes. So why are they taking 5 hours per unit? It appears that they have chosen to implement an exhaustive GLCM search that is an order of magnitude slower, rather than using existing estimation procedures that are ~98.5% accurate. Is that an excuse to use more computer time? Is there any scientific merit to that? Why aren't Fouriers included, since they are a standard technique for image analysis?

I have a number of computers that I run various BOINC projects on, but this will NEVER be one. It's a fishing expedition, being sold as cancer research, and that is a sad way to deceive the public.
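For readers unfamiliar with the feature the poster mentions, here is a small, purely illustrative sketch of gray-level co-occurrence matrix (GLCM) texture features computed with scikit-image. It is not the project's actual code, and the distances/angles are arbitrary; recent scikit-image spells these functions graycomatrix/graycoprops (older releases use greycomatrix/greycoprops):

# Illustrative GLCM texture features for a grayscale image.
# Not the project's pipeline; parameters are arbitrary examples.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(image, levels=32):
    # image: 2-D uint8 array, e.g. a photo of a crystallization drop.
    # Quantize to fewer gray levels so the co-occurrence matrix stays small.
    quantized = (image.astype(np.float64) * levels / 256).astype(np.uint8)
    glcm = graycomatrix(quantized,
                        distances=[1, 4],
                        angles=[0, np.pi / 2],
                        levels=levels,
                        symmetric=True,
                        normed=True)
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_drop = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
    print(glcm_features(fake_drop))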

Re:I OBJECT!! (2, Interesting)

Tom Womack (8005) | more than 6 years ago | (#21279551)

Given that most proteins contain tryptophan, and tryptophan fluoresces under UV, and UV lasers are not that hard to come by, wouldn't it be easier to shine a UV laser at the crystallisation plate and detect by subtraction where the glowy bit is?

Or, as a lot of molbio automation companies are offering, actually shine an X-ray beam through the putative crystal onto a detector and see if it diffracts.

Fully automated high-throughput crystal growing strikes me as a bit of a boondoggle; the sophisticated robots required for the last steps of automation are an order of magnitude more expensive than having three shifts of trained Indian or Chinese workers moving plates around and looking through microscopes.

Question to people running distributed apps (1)

this great guy (922511) | more than 6 years ago | (#21278449)

Question to slashdotters: I am wondering... Would you agree to run a distributed app if you didn't know what it did (let's say the developers want the purpose of the app to remain secret), but there was some kind of competition with money prizes for, say, the top 100 CPU-time contributors? Such as $5000 for the 1st, $1000 for the next 4 and $500 for the next 95.

(Of course I assume some would be tempted to reverse engineer the distributed app, because of pure curiosity).

Re:Question to people running distributed apps (1)

Aladrin (926209) | more than 6 years ago | (#21279769)

No. It's the same reason I don't play the lottery: I'm not going to win, so the reward holds no sway in my decisions.

Distributed computing is a lie (0, Troll)

sendorm (843943) | more than 6 years ago | (#21278535)

Think of it. A 100-watt CPU can do a thousand points' worth of work in a day. Why only thousands? Because the CPU is capable of doing millions of other things; that's why it was built in the first place. A 10-watt piece of special equipment, built from FPGAs just for the job in question, can do a million points'* worth of work in a day. Don't you feel stupid now? ATI's 1950 cards can do 30 times as much work as your super uber CPU. Don't you feel stupid now? It's like trying to replace a tank with 100 Fiats. Tanks are created with wars in mind; they are built for that. Fiats are not. The millions of watts of wasted energy for this distributed computing nonsense are pushing the earth toward global warming. Those wasted watts also mean wasted money; go give your money to a fund or a university, so they can buy dedicated hardware and do the required job much more quickly. (*: numbers may be exaggerated, but you get the idea)

National Cancer Institute (1)

killmofasta (460565) | more than 6 years ago | (#21278921)

had a MasPar MP-2 (actually 5 of them linked together).

Would that comparison be to what schools are using on a desktop machine, or to what is available to most bioinformatics facilities?

No mention of folding at home.

( love the Canadian Bacon comment!!!)

Cure for cancer, only decades away! (1)

clambake (37702) | more than 6 years ago | (#21279041)

So, where's my cure then?