
Wielding Supercomputers To Make High-Stakes Predictions

timothy posted more than 2 years ago | from the but-sir-this-research-could-explode dept.

Supercomputing 65

aarondubrow writes "The emergence of the uncertainty quantification field was initially spurred in the mid-1990s by the federal government's desire to use computer models to predict the reliability of nuclear weapons. Since then, the toll of high-stake events that could potentially have been better anticipated if improved predictive computer models had been available — like the Columbia disaster, Hurricane Katrina and the World Trade Center collapse after the 9/11 terrorist attacks — has catapulted research on uncertainty quantification to the scientific and engineering forefronts." (Read this with your Texas propaganda filter turned to High.)

65 comments

Obligatory XKCD (-1, Offtopic)

Anonymous Coward | more than 2 years ago | (#38302310)

Re:Obligatory XKCD (-1)

Anonymous Coward | more than 2 years ago | (#38302758)

C'mon, why are you posting that link in just about every thread?

Just trying to get people to waste mod points modding you down?

"Texas propaganda filter"???? (3, Insightful)

SrJsignal (753163) | more than 2 years ago | (#38302350)

...thanks a lot....Ass

Re:"Texas propaganda filter"???? (1)

ByOhTek (1181381) | more than 2 years ago | (#38302494)

I read the article, and my first thought was, yeah, there is a lot of arrogance and pride. But I also looked at the specs of the supercomputer they are talking about. Half the local disk of the supercomputer we have at my workplace, but probably 2.5 times the CPU horsepower, and 50% more overall memory. We aren't known for being slouches in that department, so UT might have some reason for that arrogance.

I'll be interested to see what the comparison looks like when our supercomputer is upgraded at the end of the month. Merry Christmas to us!

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38302546)

I read the article, and my first thought was, yeah, there is a lot of arrogance and pride.

Right, a lot of arrogance and pride just like this little gem:

Few universities are better equipped than The University of Texas at Austin at doing so.

Sounds a little arrogant.

But I also looked at the specs of the supercomputer they are talking about. Half the local disk of the supercomputer we have at my workplace, but probably 2.5 times the CPU horsepower, and 50% more overall memory. We aren't known for being slouches in that department, so UT might have some reason for that arrogance.

I'll be interested to see what the comparison looks like when our supercomputer is upgraded at the end of the month. Merry Christmas to us!

Are you getting tens of millions of dollars from the NSF and DOE like UT at Austin is?

What about this article is newsworthy? Is UT making some huge strides in predictive modeling or just building pretty big computers to do it?

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38302626)

Are you getting tens of millions of dollars from the NSF and DOE like UT at Austin is?

Probably; not sure where the funding is coming from, but even if it comes from various research groups, it can probably be traced back to grants from the NSF, DoD, and DoE.

What about this article is newsworthy? Is UT making some huge strides in predictive modeling or just building pretty big computers to do it?

Not sure, really... It seemed more fluff and speculation on the former, and the latter is more of a 'hey, look at me!' bit of nonsense.

federal government's desire to use computer models (5, Interesting)

mcgrew (92797) | more than 2 years ago | (#38302654)

What was old is new again. [usatoday.com]

In a few hours on Nov. 4, 1952, Univac altered politics, changed the world's perception of computers and upended the tech industry's status quo. Along the way, it embarrassed CBS long before Dan Rather could do that all by himself.

Computers were the stuff of science fiction and wide-eyed articles about "electric brains." Few people had actually seen one. Only a handful had been built, among them the first computer, ENIAC, created by J. Presper Eckert and John Mauchly at the University of Pennsylvania in the 1940s.

In summer 1952, a Remington Rand executive approached CBS News chief Sig Mickelson and said the Univac might be able to plot early election-night returns against past voting patterns and spit out a predicted winner. Mickelson and anchor Walter Cronkite thought the claim was a load of baloney but figured it would at least be entertaining to try it on the air.

On election night, the 16,000-pound Univac remained at its home in Philadelphia. In the TV studio, CBS set up a fake computer -- a panel embedded with blinking Christmas lights and a teletype machine. Cronkite sat next to it. Correspondent Charles Collingwood and a camera crew set up in front of the real Univac.

By 8:30 p.m. ET -- long before news organizations of the era knew national election outcomes -- Univac spit out a startling prediction. It said Eisenhower would get 438 electoral votes to Stevenson's 93 -- a landslide victory. Because every poll had said the race would be tight, CBS didn't believe the computer and refused to air the prediction.

Under pressure, Woodbury rejigged the algorithms. Univac then gave Eisenhower 8-to-7 odds over Stevenson. At 9:15 p.m., Cronkite reported that on the air. But Woodbury kept working and found he'd made a mistake. He ran the numbers again and got the original results -- an Eisenhower landslide.

Late that night, as actual results came in, CBS realized Univac had been right. Embarrassed, Collingwood came back on the air and confessed to millions of viewers that Univac had predicted the results hours earlier.

In fact, the official count ended up being 442 electoral votes for Eisenhower and 89 for Stevenson. Univac had been off by less than 1%. It had missed the popular vote results by only 3%. Considering that the Univac had 5,000 vacuum tubes that did 1,000 calculations per second, that's pretty impressive. A musical Hallmark card has more computing power.

Re:"Texas propaganda filter"???? (2)

Overunderrated (1518503) | more than 2 years ago | (#38302906)

Don't read too much into it. It's a utexas.edu news bite about an upcoming large research project at... UTexas. Literally every major university's website has a front page link about a big new research project on campus.

Re:"Texas propaganda filter"???? (0, Offtopic)

Anonymous Coward | more than 2 years ago | (#38302614)

No worries, I had my Timothy Idiocy Filter on while reading the summary.

Re:"Texas propaganda filter"???? (0, Informative)

Anonymous Coward | more than 2 years ago | (#38302732)

No kidding... People treat Texans so badly, but it's probably just jealousy. After all we invented the integrated circuit, single cell DRAM, and the single chip DSP. Oh don't forget the successful IBM PC competitors Compaq and Dell.

And as you sit there munching on your hamburger and sipping on Dr Pepper, remember those were invented in Texas too.

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38303560)

No, maybe because they are a bunch of annoying braggarts?

Texas is damn near the bottom of places I'd want to be in the US. Mind you it's higher up than the penis (Florida), ass (California) or liver (Tennessee)

Re:"Texas propaganda filter"???? (1)

Panaflex (13191) | more than 2 years ago | (#38312014)

I've lived in quite a few places... (I attended 10 schools before graduating from high school!) What's your favorite state?

Yes, Texans brag... almost as much as the Irish I think.

Re:"Texas propaganda filter"???? (1)

Kozar_The_Malignant (738483) | more than 2 years ago | (#38303836)

Texas also gave us George W. Bush and Rick Perry. Dr. Pepper tastes like fizzy cough syrup. I did my 40-hour HAZWOPER training in Fort Worth in July. I have bad memories.

Re:"Texas propaganda filter"???? (1)

SuricouRaven (1897204) | more than 2 years ago | (#38304090)

The hamburger is old. Really old. You could buy those back in the Roman Empire, though they did like theirs seasoned with a little fish oil. It doesn't take a cooking breakthrough to come up with the idea of putting a disc of meat in between two discs of bread.

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38307872)

If you're thinking of ISICIA OMENTATA then I would say that's more of a grilled meatloaf.

Re:"Texas propaganda filter"???? (1)

tehcyder (746570) | more than 2 years ago | (#38313098)

And as you sit there munching on your hamburger and sipping on Dr Pepper, remember those were invented in Texas too.

Not to mention fire, the wheel and pole dancing.

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38323786)

No... let's give credit where credit is due.

Fire is a non-invention, and thus has no intellectual property rights.
The wheel was invented in Australia, at least according to the patent office.
And pole dancing was invented by the world's second oldest profession, Politics.

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38303842)

moderation removal, nothing to see here, move along

Re:"Texas propaganda filter"???? (1)

forkfail (228161) | more than 2 years ago | (#38303948)

And here I thought it was a reference to what Texas is doing to textbooks...

Re:"Texas propaganda filter"???? (0)

Anonymous Coward | more than 2 years ago | (#38304978)

Oh please.
Do we say "California propaganda filter"?
Of course not, because the comment is just ridiculous.

I understand people regretting the influence on textbooks that a couple of idiots here in Texas have over the nation, but the other is just silly.

Like Global Climate Change ? (5, Interesting)

RichMan (8097) | more than 2 years ago | (#38302444)

Seems to me all the supercomputer models are predicting that the disaster called global climate change is driven by human CO2 emissions. We have predicted it. It has a decidedly human cause against which we can take direct action. Over the next 50 years billions of people will be displaced. Trillions of dollars or more of infrastructure will be lost to rising oceans.

Are we doing anything? Seems to me the whole prediction thing is useless if we are unwilling to take action on the results.
Is it because the results are wrong, or is it because it involves money in people's pockets?

We can make the predictions, we need to remove the barriers to action.

Re:Like Global Climate Change ? (0)

Anonymous Coward | more than 2 years ago | (#38302712)

The same supercomputers that use CO2-producing electricity. Like quantum physics, you change the outcome when you observe. Only this time, you have no clue what you're really looking for in a model that may or may not be correct. Astounding!

Re:Like Global Climate Change ? (0)

Anonymous Coward | more than 2 years ago | (#38302848)

Not sure what you're smokin'.

Perhaps you meant 'over the next 50 billion years people will be displaced'. That I can buy.

Re:Like Global Climate Change ? (2)

khallow (566160) | more than 2 years ago | (#38302964)

Over the next 50 years billions of people will be displaced.

By what? Those supercomputer models aren't predicting that global warming will be the cause.

Re:Like Global Climate Change ? (1)

Anonymous Coward | more than 2 years ago | (#38302992)

There is a strange dichotomy in how humans react to an event depending on whether it was caused by other humans or by nature (as if they are different). If 30,000 people die due to a flood or earthquake, a couple of years later it will just be known as the flood of 2001, or forgotten entirely. If 3,000 people die due to a human plot, it's remembered and re-remembered periodically on the exact date. Just yesterday, we were remembering the 70th anniversary of Pearl Harbor. For example, the recent quake in Haiti killed over 300,000 people. Granted, it's not in the US so nobody cares, but that's a lot of people to be taken out by a single event.

Re:Like Global Climate Change ? (1)

khallow (566160) | more than 2 years ago | (#38305740)

There is a strange dichotomy in how humans react to an event depending on whether it was caused by other humans or by nature (as if they are different).

It's not strange when you figure that a human-caused event can be repeated frequently if the culprits get away with it. Natural disasters don't increase in frequency just because one happens.

In elementary terms, I think it has to do with the Prisoner's Dilemma. As you probably recall, the game is set up so that if two people cooperate they do OK. If one cooperates and one cheats, then the cheater does a lot better. And if both cheat, then both suffer worse than if they had both cooperated (but not as badly as a cooperator scammed by a cheater).

Society faces these sort of decisions all the time. I could respect the laws of the land or go Grand Theft Auto. We could ignore each other in the elevator or fight to take each others' stuff.

We can change a prisoners' dilemma situation into a different game by penalizing the cheating enough. That's where actions that otherwise appear exaggerated come in. The punishments for theft, murder, and other harmful behaviors are made relatively high so that rational people won't engage in them. That leads to a society that is mostly cooperative.

It's also worth noting that if someone finds a way to profit, real or imaginary, from murder even after punishment is factored in, then that is a particularly dangerous situation. One doesn't sit on that and assume it's just a random occurrence. For example, the 9/11 attacks happened once, but the planners had plenty of manpower to commit future attacks. It'd be foolish to assume, in the absence of preventative measures, that Al Qaeda or whoever wouldn't have scaled up the attacks once they had a few successes.

Re:Like Global Climate Change ? (0)

Anonymous Coward | more than 2 years ago | (#38303026)

Using facts and data like that is pure grade A soocalist taulk. Facts have a well known lib'rul bias. Why do you hate 'merkins?!!!

Re:Like Global Climate Change ? (1)

RichMan (8097) | more than 2 years ago | (#38303812)

Global climate change will lead to sea level rises. This will displace people.
It is rising; see the sea level graph at http://www.climate.org/topics/sea-level/index.html#sealevelrise

The US has 57,000 km^2 of land below 1.5 m above sea level, and another 33,000 km^2 between 1.5 m and 3.5 m (source: epa.gov).

"Over 600 million people live in coastal areas that are less than 10 meters above sea level, and two-thirds of the world’s cities that have populations over five million are located in these at-risk areas (http://www.climate.org/topics/sea-level/index.html#sealevelrise) Low-lying coastal regions in developing countries such as Bangladesh, Vietnam, India, and China have especially large populations living in at-risk coastal areas such as deltas. "

"Shanghai, altitude roughly 3 meters (10 feet) above sea level, is among dozens of great world cities — including London, Miami, New York, New Orleans, Mumbai, Cairo, Amsterdam and Tokyo — threatened by sea levels that now are rising twice as fast as projected just a few years ago, expanding from warmth and meltwater."

GIGO (0)

harvey the nerd (582806) | more than 2 years ago | (#38303840)

Another old computer rule, Garbage In, Garbage Out. Penn State buggers the data, and forgets to finish the whole energy equation with unmodeled terms, like non-radiant solar energy. We are more likely to experience unusual temperature declines across the next 30 years according to more predictive OLD models.

Re:Like Global Climate Change ? (1)

forkfail (228161) | more than 2 years ago | (#38304196)

To ignore the oncoming train is, unfortunately, pretty ingrained into human nature, and has been for some time.

See also Cassandra.

The better the prediction ... (0)

Anonymous Coward | more than 2 years ago | (#38302454)

... the worse the outcome. That's why it is a black swan. The best way to make the world stable is to randomize the input. For example, the Fed can adjust the probability of a rate hike, but the actual event is random. They can publish the probability, but the final outcome will just be a lottery. But this is too much for most people, particularly politicians.

Who computes the computer? (1, Troll)

VortexCortex (1117377) | more than 2 years ago | (#38302506)

Since then, the toll of high-stake events that could potentially have been better anticipated if improved predictive computer models had been available — like the Columbia disaster, Hurricane Katrina and the World Trade Center collapse after the 9/11 terrorist attacks — has catapulted research on uncertainty quantification to the scientific and engineering forefronts

How sure are we that the tolls could have been better anticipated?

We should leverage a supercomputer to calculate the potential that each high-stake event can be better anticipated by a supercomputer model. Then simply pool our resources and use greater predictive computing power for the events we have the most potential to anticipate.

I put it to you that once such a model can be computed, it will be trivial to use predictive computer models to determine which supercomputer will predict the most accurate results. Thus, we can leave it alone to the task of predicting, knowing that it has a potentially better chance of anticipating the anticipation.

Furthermore, we could use the predictive models to better anticipate which researcher will be able to quantify the amount of uncertainty quantification needed to quantify quantum uncertainty; they would also be the ones who could finally tell us what quantity of Schrödinger's cat is undead.

Chaotic Systems (3, Insightful)

Tokolosh (1256448) | more than 2 years ago | (#38302610)

Some things can be well-modeled by using good input data and fine-grained analysis, which may require supercomputers.

A problem arises when inherently chaotic (in the mathematical sense) systems are modeled. No amount of computing power will improve the quality of the results.

It may be hard to know what type of system you are dealing with.

And by definition, black swans cannot be modeled at all.

I'm not so sure of that... (1)

Anonymous Coward | more than 2 years ago | (#38302756)

And by definition, black swans cannot be modeled at all.
 
... because after all these years, I'd still let Natalie Portman [imdb.com] model me with a bowl of hot grits anyday.

Re:Chaotic Systems (2)

Overunderrated (1518503) | more than 2 years ago | (#38302880)

As a researcher in computational science (and chaotic systems), working on things similar to the news blurb posted here: no. There are many methods for dealing with chaotic/uncertain inputs. Monte Carlo approaches, for example.
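
A minimal sketch of the idea, using a toy chaotic system (the logistic map) standing in for a real simulation code; the uncertain input distribution here is assumed purely for illustration, not the method used by any particular group. The point is that you stop asking for one trajectory and instead sample the uncertain input and report statistics of the output:

<ecode>
# Monte Carlo propagation of an uncertain input through a chaotic map.
# Toy example: the logistic map stands in for a real simulation code, and the
# input distribution (a guessed measurement error) is assumed, not real data.
import random
import statistics

def logistic_map(x0, r=3.9, steps=100):
    """Iterate the logistic map from x0; chaotic for r near 3.9."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

def monte_carlo(n_samples=10000):
    outputs = []
    for _ in range(n_samples):
        # Uncertain initial condition: nominally 0.5, with small Gaussian noise.
        x0 = min(max(random.gauss(0.5, 0.01), 0.0), 1.0)
        outputs.append(logistic_map(x0))
    # Report a distribution, not a single answer.
    return statistics.mean(outputs), statistics.stdev(outputs)

if __name__ == "__main__":
    mean, std = monte_carlo()
    print("output mean = %.3f, output std dev = %.3f" % (mean, std))
</ecode>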

Re:Chaotic Systems (1)

Tokolosh (1256448) | more than 2 years ago | (#38305652)

Ok, so I assume you use a Monte Carlo technique to generate probabilities of outcomes. But does having supercomputers improve the accuracy of the results, with any certainty?

Re:Chaotic Systems (2)

Overunderrated (1518503) | more than 2 years ago | (#38305778)

Ok, so I assume you use a Monte Carlo technique to generate probabilities of outcomes. But does having supercomputers improve the accuracy of the results, with any certainty?

Well yes, of course. That's the entire purpose of uncertainty prediction, and of HPC simulations in general. In any kind of complex numerical simulation (say, turbulent aerodynamics), the accuracy with which you can simulate a given physical situation is entirely constrained by your available computational power. You must find a balance between the level of detail you need and the computer power available (i.e., direct numerical simulation of turbulence for a full-sized aircraft is both entirely unfeasible computationally and provides drastically more information than is necessary for analysis). These are well-studied problems, and not "guesswork" as you seem to imply. Extending that to a situation where the inputs themselves are uncertain, more computer power clearly leads to more accurate results. Basic statistics tells you that the more data points you have (in this case, results from a range of inputs), the more reliable your predictions of uncertainty become.
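
To put a number on that last sentence, here's a small illustrative sketch (the toy model and the assumed input noise are made up): the standard error of a Monte Carlo estimate shrinks roughly as 1/sqrt(N), which is exactly where extra compute buys tighter uncertainty bounds.

<ecode>
# Standard error of a Monte Carlo estimate vs. sample count.
# Toy model: output = sin(input), with input ~ N(1.0, 0.1). Purely illustrative.
import math
import random

def estimate(n_samples):
    samples = [math.sin(random.gauss(1.0, 0.1)) for _ in range(n_samples)]
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / (n_samples - 1)
    std_err = math.sqrt(var / n_samples)  # shrinks roughly as 1/sqrt(n_samples)
    return mean, std_err

if __name__ == "__main__":
    for n in (100, 10000, 1000000):
        mean, std_err = estimate(n)
        print("N=%9d: mean=%.4f  std_err=%.6f" % (n, mean, std_err))
</ecode>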

Re:Chaotic Systems (0)

Anonymous Coward | more than 2 years ago | (#38305282)

If something is defined by being "unpredictable", that just means the set of events covered by the definition shrinks as prediction gets better.

People don't understand models and computers (2)

statdr (1585151) | more than 2 years ago | (#38302680)

Having a supercomputer won't help predict rare events unless you already have a particular mathematical model for those events (see physics). If you don't have a model for how rare events occur (terrorist attacks, natural disasters), then a computer (of any type) won't help you predict them. If you want to build a model, then you need lots and lots of events (and non-events) and associated data. If you have a lot of data, perhaps you'd need a supercomputer to investigate the interim models you come up with before you arrive at a final model. You can't predict rare events. Modeling, and statistics in general, is designed to make statements about things given sufficient data. By definition, a one-off event or extremely rare event doesn't provide enough data to allow generalization (or inference).

Predictions are only as good as the models... (3, Insightful)

divisionbyzero (300681) | more than 2 years ago | (#38302686)

The only way to validate the model is to apply it and see if it works. The problem with high-risk disasters is that they don't happen all that often, so it's hard to validate the model. I mean, sure, you can special-case it to death to get it to predict "the Columbia disaster, Hurricane Katrina and the World Trade Center collapse," but if you special-case it too much it loses predictive ability for similar but not identical events.

The reason there is so much "uncertainty" (not for me but many others) around climate change is that it is practically a singular event that'll occur 50-100 years in the future. Of course the models can be validated as we go but how much validation is enough? When it's too late?

Re:Predictions are only as good as the models... (1)

Registered Coward v2 (447531) | more than 2 years ago | (#38302800)

The only way to validate the model is to apply it and see if it works. The problem with high-risk disasters is that they don't happen all that often, so it's hard to validate the model. I mean, sure, you can special-case it to death to get it to predict "the Columbia disaster, Hurricane Katrina and the World Trade Center collapse," but if you special-case it too much it loses predictive ability for similar but not identical events.

True, but, despite the /. summary, the article really isn't about predicting events so much as trying to assess the level of uncertainty around the results of the model. By quantifying the uncertainty you can better use the results to decide what to do. Essentially, you want to be able to say "I think this will be the outcome, but I am only so sure about the accuracy of my prediction." It's not really about predicting the future (in the sense of "what event will occur") but about what will happen if x occurs and how certain I am about that result.

Re:Predictions are only as good as the models... (1)

divisionbyzero (300681) | more than 2 years ago | (#38311090)

The only way to validate the model is to apply it and see if it works. The problem with high-risk disasters is that they don't happen all that often, so it's hard to validate the model. I mean, sure, you can special-case it to death to get it to predict "the Columbia disaster, Hurricane Katrina and the World Trade Center collapse," but if you special-case it too much it loses predictive ability for similar but not identical events.

True, but, despite the /. summary, the article really isn't about predicting events so much as trying to assess the level of uncertainty around the results of the model. By quantifying the uncertainty you can better use the results to decide what to do. Essentially, you want to be able to say "I think this will be the outcome, but I am only so sure about the accuracy of my prediction." It's not really about predicting the future (in the sense of "what event will occur") but about what will happen if x occurs and how certain I am about that result.

Yeah, I hate to go all Taleb but that doesn't make sense from a Black Swan point of view. The catastrophes outside the model are always the worst catastrophes because they are outside the model.

Re:Predictions are only as good as the models... (1)

Registered Coward v2 (447531) | more than 2 years ago | (#38313200)

The only way to validate the model is to apply it and see if it works. The problem with high-risk disasters is that they don't happen all that often, so it's hard to validate the model. I mean, sure, you can special-case it to death to get it to predict "the Columbia disaster, Hurricane Katrina and the World Trade Center collapse," but if you special-case it too much it loses predictive ability for similar but not identical events.

True, but, despite the /. summary, the article really isn't about predicting events so much as trying to assess the level of uncertainty around the results of the model. By quantifying the uncertainty you can better use the results to decide what to do. Essentially, you want to be able to say "I think this will be the outcome, but I am only so sure about the accuracy of my prediction." It's not really about predicting the future (in the sense of "what event will occur") but about what will happen if x occurs and how certain I am about that result.

Yeah, I hate to go all Taleb but that doesn't make sense from a Black Swan point of view. The catastrophes outside the model are always the worst catastrophes because they are outside the model.

True, but they aren't trying to predict Black Swan events, at least not from my read of TFA. Take Columbia, for example. If, after running the damage model, it had said "there is only a 20% chance we are right - i.e. there is a high degree of uncertainty surrounding our results," then NASA would have known to further analyze the situation. They weren't trying to say "there is x% chance a shuttle will suffer damage on launch that results in catastrophic failure of the vehicle on reentry."

By being more certain about the uncertainty you can better use the model's results to make decisions. You can also better use the model to estimate the outcomes of rare events - such as what might happen if we experience a Cat 5 hurricane, how much time we have if we wipe out the backup diesels, etc. You can't say what the chances of the initiating event are, but you should be able to better assess the likelihood of the outcomes.

Scepticism... (4, Insightful)

shic (309152) | more than 2 years ago | (#38302730)

I like supercomputers in the same way I like architectural monuments - there's an element of beauty in stretching technology to ever more extreme goals, but I'm far from convinced that there's an objective, practical, point to any of the calculations they make.

I'm very sceptical about climate change prediction - because, without any calculation, it's blindingly obvious that climate will change (all evidence suggests vast changes throughout history) and - because mankind is significant among life on earth - obviously we should assume a fair chunk to be 'man made'. I seldom see the questions that matter addressed... for example, in what ways can we expect climate change to be beneficial to mankind? When we ask the wrong questions, no matter how large-scale or accurate our computation, it will be worthless. Don't get me wrong, I see immense value in forecasting... but I don't see available computational power as a limiting factor... in my opinion there are two critical issues for forecasting: (1) collecting relevant data accurately; (2) establishing the right kind of summaries and models. While some models are computationally expensive - in my opinion - the reason for attempting to brute-force these models has far less to do with objective research and far more to do with political will to have a concrete answer irrespective of its relevance... The complexity of extensive computation is exploited to lend an air of credibility, in most cases, IMHO.

"Don't worry about the future. Or worry, but know that worrying is as effective as trying to solve an algebra equation by chewing bubble gum. The real troubles in your life are apt to be things that never crossed your worried mind, the kind that blindside you at 4 p.m. on some idle Tuesday."

The reason is simple: avoidable disasters occur not because we haven't done enough calculations - but because the calculations we do are done for the wrong reasons and produce irrelevant results. If we want to move forwards, we need more observation and more intelligent consideration. Iterating existing formulas beyond the extent possible with off-the-shelf technology, IMHO, is unlikely to yield anything significant.

Re:Scepticism... (1)

Dynetrekk (1607735) | more than 2 years ago | (#38312504)

(1) collecting relevant data accurately; (2) establishing the right kind of summaries and models.

Yes, you are right. But due to sensitivity to initial conditions and a positive Lyapunov exponent [wikipedia.org], the number of days you are able to forecast scales only logarithmically with your computing power, even with near-perfect knowledge of the initial conditions. So yes, bigger is better when it comes to weather prediction.
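
Roughly: with largest Lyapunov exponent lambda, an initial error delta_0 grows like delta_0 * e^(lambda * t), so the usable forecast horizon is about t ~ (1/lambda) * ln(tolerance / delta_0). A quick sketch with made-up numbers (the exponent and error values are assumptions, not real atmospheric figures) shows why each big improvement in initial data or resolution only buys a fixed extra chunk of lead time:

<ecode>
# Predictability horizon of a chaotic system: t_max ~ (1/lambda) * ln(tol / delta0).
# All numbers below are illustrative assumptions, not real atmospheric values.
import math

LYAPUNOV = 0.5   # per day: assumed e-folding rate of small errors
TOLERANCE = 1.0  # error level at which the forecast is considered useless

def horizon_days(initial_error):
    """Days until an initial error grows to the tolerance level."""
    return math.log(TOLERANCE / initial_error) / LYAPUNOV

if __name__ == "__main__":
    for delta0 in (1e-2, 1e-3, 1e-4):  # each step: 10x better initial data
        print("initial error %g -> ~%.1f day horizon" % (delta0, horizon_days(delta0)))
    # Each 10x improvement buys only ln(10)/LYAPUNOV ~ 4.6 extra days of forecast.
</ecode>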

Re:Scepticism... (1)

shic (309152) | more than 2 years ago | (#38314308)

The diminishing returns implied by the Lyapunov exponent definitely lend credibility to my claim that much of supercomputing is objectively pointless, but I was anxious not to focus upon only one of the ways in which calculations might be irrelevant.

I'd agree that "bigger is better" - but only if we exclude cost from our assessment. With significant financial overheads for marginal improvements in accuracy, I have to wonder - at the extremes of industry practice - might the same funding have been more effectively deployed otherwise? Might a better strategy be to simply accept the limits of inexpensive computing, and focus on finding more effective approaches to practical problems?

Re:Scepticism... (1)

Dynetrekk (1607735) | more than 2 years ago | (#38314356)

The point is that by doubling your supercomputer size, you gain one day of weather forecast. Yes, that is a very small gain, in some mathematical sense; however, if you are a farmer and you're planning your harvest, that's huge. Same if you're a fisherman and want to stay out there as long as possible until the winter storm actually hits. For society, this information is expensive to obtain, but the returns on the investment are great.

Re:Scepticism... (1)

shic (309152) | more than 2 years ago | (#38396336)

Doubling your computational effort to extend your weather forecast to a 24th day might well be justified, as might doubling it again to get an extra hour. Doubling again to get the next few minutes, or again for an extra few seconds is far harder to justify - especially as other addressable factors might have greater influence on the uncertainty of the predictions.

We clearly have a different subjective take on the typical practical value of calculations at the cutting edge of 'brute-force' computation. Without specifics we are unlikely to progress the debate. There are, undoubtedly, some problems that can only be tackled by more grunt (tightly coupled computation) but - in my opinion - progressing these problems head-on, typically, does not offer benefits commensurate with the cost.

Re:Scepticism... (1)

EETech1 (1179269) | more than 2 years ago | (#38312542)

Yeah, all the land that used to be forest or grassland cleaning the air and providing space for life to live, is now covered in black asphalt soaking up heat, covered in millions of pollutant spewing vehicles filled with air-conditioned lead-footed egomaniacs radiating thousands of BTUs a minute of centuries old carbon into a smothered landscape no longer able to contain plant life and therefore clean itself. Headed from Life_Of_Consumption pounding out 60 miles each way to work at PollutionCorp LLC, sucking up megawatts of coal produced electricity into (insert_industrial_process_here) further radiating millions more BTUs of heat and millions of tons of toxic chemicals into the air...

What could possibly go wrong?

Who can possibly ignore this and imagine the planet can continue to suck it up (much less thrive), and maintain the incredibly delicate balance of life, habitat, food, and clean water that made it possible to get here in the first place while we shit all over it in the most industrial way possible on the land, in the oceans and lakes and rivers, forests hah gone, and into the very air it breathes!

This needs serious Global attention NOW because (we shoulda been doing it all along) we have a long way to go to get Earth back to a state in which it can begin to slowly repair itself before it's too far gone to ever even be a nice place to live again!

  For Fuck Sake... If you can't see this one coming, I hope you can't get out of its way when it gets here!

Cheers

The title is misleading (2, Interesting)

Anonymous Coward | more than 2 years ago | (#38302754)

The title is misleading and not really correct, because it doesn't describe the main thrust of the project. What the group at Texas is trying to do is change the way computer models make predictions, because they recognize that predicting events like Katrina or 9/11 with any kind of accuracy, based on essentially no data, is basically impossible, and that even when prediction is possible, it's still full of uncertainty.

They don't want the models to spit out a single answer (e.g. "There will be a sea level rise of 10 centimeters within 20 years"), but rather a probability distribution ("The sea level rise over the next twenty years can be modeled as a normal distribution with a mean of 10 centimeters and a standard deviation of 5 centimeters"). The distribution is supposed to be based on uncertainties that arise at various stages in the process of modeling, such as model assumptions and data collection.
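
A minimal sketch of what that looks like in practice. The linear "sea level model" and its input uncertainties below are entirely invented for illustration; the only point is that the output is a distribution you can summarize, not a single number:

<ecode>
# Turning uncertain inputs into a predictive distribution instead of a point estimate.
# The toy "sea level model" and its parameter ranges are invented for illustration.
import random
import statistics

def sea_level_rise_cm(rate_cm_per_decade, accel_cm_per_decade2, decades=2):
    # Hypothetical stand-in for a real physics-based model.
    return rate_cm_per_decade * decades + accel_cm_per_decade2 * decades ** 2

def predictive_distribution(n_samples=100000):
    samples = []
    for _ in range(n_samples):
        rate = random.gauss(4.0, 1.5)    # uncertain input: cm per decade
        accel = random.gauss(0.5, 0.5)   # uncertain input: cm per decade^2
        samples.append(sea_level_rise_cm(rate, accel))
    samples.sort()
    mean = statistics.mean(samples)
    std = statistics.stdev(samples)
    p05 = samples[int(0.05 * n_samples)]
    p95 = samples[int(0.95 * n_samples)]
    return mean, std, p05, p95

if __name__ == "__main__":
    mean, std, p05, p95 = predictive_distribution()
    print("20-year rise: mean %.1f cm, std %.1f cm, 90%% interval [%.1f, %.1f] cm"
          % (mean, std, p05, p95))
</ecode>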

Personally, I think in certain cases these techniques are great, and in other cases they are worse than useless. If you have a model that's supposed to predict terrorist attacks, it will happily tell you that with 90% confidence the probability of a terrorist attack in Location X in the next year is between 1 and 3 percent. This may be perfectly correct, but highly misleading, because fundamentally the event is not probabilistic, and the only reason it appears to be is that a key piece of data is missing. As such, what the computer should really do is say the following. "Dear DHS: I don't know whether the terrorists are planning the attack. If they are, it is very likely to occur. If they aren't, it won't happen. Please do your job and go find out whether they are or not, and let me do more interesting things. Sincerely, Computer."

Re:The title is misleading (3, Interesting)

statdr (1585151) | more than 2 years ago | (#38302834)

That doesn't make much sense. Models don't just spit out one answer. Models will report estimates and estimates of the uncertainty (typically standard errors) of the estimates. These uncertainty estimates define the probabilistic distribution from which the events being modeled derive. Of course, there can be quite complex underlying probabilistic distributions; not just the simple case of a one-dimensional distribution defined by one parameter. Computers are useful when the number of dimensions of the underlying probabilistic distribution is large and also when the form of the underlying distribution is not some convenient structure. However, while a supercomputer can handle models with complex underlying probabilistic distributions, this doesn't mean that you'll get anything useful out of the modeling exercise. You still need to have lots of data on events (and non-events) to try and predict the events with any degree of accuracy and precision.

Re:The title is misleading (1)

statdr (1585151) | more than 2 years ago | (#38302946)

In general, I'm not a big fan of exploratory modeling without a follow-up process of validation. And I'm not a fan of using the data used to build a model to then validate the model (hello, climate change). One can find predictors of outcomes (e.g. oh look, the probability of a terrorist attack is higher if it's a Tuesday afternoon) that are statistically significant just because of the large amount of data available. Model-building without validation is just "correlation does not imply causation". Unfortunately, model-building is less a science and more a consensus-building exercise among professionals in a given field (appeal to authority). That's why I distrust the models used to predict temperature changes by the pro-man-made-climate-change scientists. In a closed loop of like-minded individuals who, for example, don't see any reason not to validate models with the same data used to build the models, you end up with a process that produces unchallenged work.

(No subject) (2)

spirito (1552779) | more than 2 years ago | (#38303002)

This is a topic of great interest in aerodynamics. The aim is to understand how uncertainties in the input data (flow conditions, geometric imperfections, ...) affect the predicted aircraft performance. Some research has already taken place in Europe; for example, see the nodesim project (http://www.nodesim.eu).

Die, please (0, Informative)

Anonymous Coward | more than 2 years ago | (#38303076)

Read this with your Texas propaganda filter turned to High.

Drop dead you bigoted sack of shit. Fucking geek filth... your delusions of superiority are just one reason why people spit on your scummy ilk.

Re:Die, please (0)

Anonymous Coward | more than 2 years ago | (#38306934)

That's an awfully big inferiority complex you have there.

Texas-sized, maybe.

Supercomputers? (0)

Anonymous Coward | more than 2 years ago | (#38304080)

Supercomputers are so 20th century. These days, it is all about distributed computing. I guess if you've got an unlimited budget a supercomputer would be fine, but if you don't decompose the problem, you'll still hit the memory limits of your machine.

Data is growing faster than memory. Storage isn't the problem. The problem is how much you can load into memory at one time.

There is software out there that can handle really huge data without billions of dollars of hardware (for example, Revolution Analytics' version of R).

Go Texas!

I predicted (0)

Anonymous Coward | more than 2 years ago | (#38306720)

this sort of disaster (using supercomputers to predict such unnatural disasters) on the HP TouchPad I got in the Fire Sale.

Our supercomputers told us the location of WMD (0)

Anonymous Coward | more than 2 years ago | (#38308528)

We ran the codes, and we now know without a doubt that Saddam is storing his weapons of mass destruction in these computer-determined locales.

And thanks to these computers, we now know that the masses will readily believe that 9/11 was caused by a small group of radical muslims and that the US government had absolutely *no* idea that anyone could ever possibly use planes as a weapon.

Like the previous poster said: Garbage In, Garbage Out.
