
Edward Lorenz, Father of Chaos Theory, Dies at 90

timothy posted more than 6 years ago | from the what-are-the-odds-his-middle-name-was-norton dept.


An anonymous reader writes "Professor Edward N. Lorenz, who discovered in 1961 that subtle changes in the initial conditions of a weather simulation program could cause very large differences in its results, died of cancer Wednesday at the age of 90. The contributions of the father of chaos theory, who coined the term 'the butterfly effect' and also discovered the Lorenz Attractor, are best summarized by the wording of the Kyoto Prize in 1991 which noted that his discovery of chaos theory 'profoundly influenced a wide range of basic sciences and brought about one of the most dramatic changes in mankind's view of nature since Sir Isaac Newton.'"
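Lorenz's 1961 discovery is easy to reproduce. A minimal sketch (forward Euler on the Lorenz '63 equations; the step size, run lengths, and one-part-in-a-billion perturbation are my illustrative choices, not taken from Lorenz's actual runs): two trajectories that start almost identically end up wildly different.

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz '63 equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def max_x_gap(a, b, steps):
    """Step two trajectories together; return the largest x-gap seen."""
    gap = 0.0
    for _ in range(steps):
        a, b = lorenz_step(a), lorenz_step(b)
        gap = max(gap, abs(a[0] - b[0]))
    return gap

start = (1.0, 1.0, 1.0)
nudged = (1.0 + 1e-9, 1.0, 1.0)   # one part in a billion: the "butterfly"

early = max_x_gap(start, nudged, 1000)    # one time unit: still microscopic
late = max_x_gap(start, nudged, 40000)    # forty time units: fully diverged
print(early, late)
```

The initial difference is amplified by many orders of magnitude before saturating at the size of the attractor, which is exactly the sensitivity to initial conditions the summary describes.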


104 comments


Died of cancer... but why? (5, Funny)

DamienRBlack (1165691) | more than 6 years ago | (#23133580)

Why aren't they reporting that his cancer was caused by a zebra sneezing under a fig tree in the UK last fall? It seems quite relevant.

Re:Died of cancer... but why? (3, Informative)

dreamchaser (49529) | more than 6 years ago | (#23133588)

You might get modded troll or flamebait by the people who didn't understand your subtle reference. It could have been a butterfly taking off in New Mexico though. We aren't quite sure.

A great man whose contributions will be remembered for centuries to come has passed. I think I'm going to fire up a fractal generator and play with Lorenz Attractors now.

Re:Died of cancer... but why? (3, Interesting)

Anonymous Coward | more than 6 years ago | (#23133698)

And it would be warranted. The 'butterfly effect' is a horribly misleading statement. Mathematical chaos applies only to certain deterministic systems, not real life. There is no evidence that the real world is a simulation, or, even if it were, that it falls into the narrow range of non-linear dynamics problems that exhibit mathematical chaos. Lorenz's attempt at modeling the weather certainly exhibited mathematical chaos, but the model wasn't the weather itself.

Re:Died of cancer... but why? (4, Funny)

smallfries (601545) | more than 6 years ago | (#23133744)

Besides, decades of research have improved those models to the extent that we can accurately predict the weather anywhere up to 20 minutes into the future.

Re:Died of cancer... but why? (3, Funny)

smitty_one_each (243267) | more than 6 years ago | (#23133872)

You omit the fact that we've also made some strides in predicting what the past should have been.

Re:Died of cancer... but why? (1)

smallfries (601545) | more than 6 years ago | (#23133962)

Very true, I believe that Blair et al. have published a lot in this area.

Re:Died of cancer... but why? (1)

Bromskloss (750445) | more than 6 years ago | (#23135098)

decades of research have improved those models to the extent that we can accurately predict the weather anywhere up to 20mins in the future.

Yeah, through sample and hold.

Re:Died of cancer... but why? (1)

maxume (22995) | more than 6 years ago | (#23133746)

You mean the Zebra sneezes and it doesn't affect the gigawatt scale prevailing wind?!

Re:Died of cancer... but why? (1)

ta bu shi da yu (687699) | more than 6 years ago | (#23133764)

But Agent Smith... you say that now but yesterday when Neo kicked your butt it was like "it's only a simulation, it's not like this is reality"...

Re:Died of cancer... but why? (0)

Anonymous Coward | more than 6 years ago | (#23133930)

Mathematical chaos applies to only certain deterministic systems, not real life.
Heisenberg's ghost would like to haunt you now.

Re:Died of cancer... but why? (0)

Anonymous Coward | more than 6 years ago | (#23133972)

Mathematical chaos applies to only certain deterministic systems, not real life.
Heisenberg's ghost would like to haunt you now.
Why? Quantum indeterminacy precludes mathematical chaos. Since real life is governed by quantum indeterminacy, real life can't have mathematical chaos.

Re:Died of cancer... but why? (1)

dreamchaser (49529) | more than 6 years ago | (#23134068)

Except it was just a reference to a man and his work, not a discussion on the validity of chaos theory as applied to dynamic systems. While you are more or less correct you do not have to be pedantic about it nor imply that the original parent was trolling.

Re:Died of cancer... but why? (1)

whit3 (318913) | more than 6 years ago | (#23135590)

There are many real physical systems where the 'butterfly effect' is very evident, and the history of science includes prior art. About 200 years ago, Lagrange found in his orbital calculations of planets that the sensitivity of the solutions to errors in observation could be extreme. A century ago, the excessive sensitivity of matrices with large eigenvalues provided a good model of the problem.

Today we call it 'chaos theory', but Lorenz is just a recent worker in the field, not really the father...

Re:Died of cancer... but why? (1)

mbkennel (97636) | more than 6 years ago | (#23135858)

The 'butterfly effect' is a horribly misleading statement. Mathematical chaos applies to only certain deterministic systems, not real life. There is no evidence that the real world is a simulation or even if it was that it falls into the narrow range of non-linear dynamics problems that exhibit mathematical chaos.
There is ample experimental evidence that chaos occurs all over physical reality.

Lorenz's attempt at modeling the weather certainly exhibited mathematical chaos, but the model wasn't the weather itself.
The model wasn't weather, obviously.

Re:Died of cancer... but why? (1)

Moderatbastard (808662) | more than 6 years ago | (#23135938)

Mathematical chaos applies to only certain deterministic systems, not real life.
Bullcrap. There are numerous real-world examples in biology (populations, heart rhythm), physics (double pendulum, leaky waterwheel), economics (stock markets) and chemistry (some reaction named after an unpronounceable Russian or two).

Some might even mention the slashdot modding system, as evidenced by the fact that you got modded up which wouldn't occur in any sensible Newtonian system.

Re:Died of cancer... but why? (1)

Darby (84953) | more than 6 years ago | (#23137014)

Mathematical chaos applies to only certain deterministic systems, not real life. There is no evidence that the real world is a simulation or even if it was that it falls into the narrow range of non-linear dynamics problems that exhibit mathematical chaos.

There is no evidence that the earth is a simulation, but there are experiments demonstrating that some aspects of reality do follow the model.

Mitchell Feigenbaum's period doubling cascade into chaos started out as a mathematical curiosity, but Albert Libchaber's experiment on thermal convection in liquid helium demonstrated that that's actually the path nature takes (given the experimental set up and all, obviously) even to the point that the ratio between period doublings is the Feigenbaum number.
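For anyone who wants to see that cascade without a liquid-helium cell, the logistic map x → rx(1−x) is the standard toy example (my sketch, not Libchaber's apparatus): the attractor's period doubles as r passes roughly 3, 3.449, 3.544, ..., with the gaps between doublings shrinking by Feigenbaum's ratio of about 4.669, until the orbit becomes chaotic.

```python
def orbit(r, x=0.5, transient=2000, n=16):
    """Iterate the logistic map past the transient; return n rounded samples."""
    for _ in range(transient):
        x = r * x * (1.0 - x)
    samples = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        samples.append(round(x, 6))
    return samples

period1 = len(set(orbit(2.8)))   # below the first doubling: a fixed point
period2 = len(set(orbit(3.2)))   # after the first doubling: period 2
period4 = len(set(orbit(3.5)))   # after the second doubling: period 4
chaotic = len(set(orbit(3.9)))   # past the cascade: no short period at all
print(period1, period2, period4, chaotic)
```

Counting distinct values in the settled orbit gives the period directly: 1, 2, 4, and then, in the chaotic regime, essentially every sample is different.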

Overrated (4, Informative)

2.7182 (819680) | more than 6 years ago | (#23133738)

It's controversial that he was the first; a lot of people worked in this area. In fact, it is controversial whether chaos will ever contribute to science in any way. The pure mathematical theory is very hard: see the work by Curt McMullen, for example. Many people I know are very skeptical, and there are a lot of bad papers purporting to use chaos theory.

Re:Overrated (4, Informative)

2.7182 (819680) | more than 6 years ago | (#23133848)

OK, this is going to get some people pissed, but it is my honest opinion and I am not doing this to troll. There are a lot of scientific areas that are promoted to get people's careers going. In fact, they are largely vaporware. Here are some examples:

1. Robotics. Most of academic robotics is pretty lame. The good people go into industry. Consider for example Marc Raibert and BigDog. (Look on YouTube.) The guy is a true genius, so he left MIT. Most robotics that you see in the media is really bad, like Alan Alda talking to a robot that "has emotions".

2. Wavelets. First of all, they were invented a long time ago. It's just another choice of basis, and it's not clear they are the best for compression or denoising. Look closely and you will see that classical harmonic analysis provides a good competing answer. JPEG2000 may be better than JPEG, but it's not clear whether that is due to the use of wavelets or to the fact that they had something like 40 people working on the lossless coding scheme, which is an ad hoc heuristic. And besides, how many of us are using JPEG2000? Finally, people I know who work in it say "I just use the Haar basis." Haar found that basis in something like 1912.

3. Chaos. By definition hard to apply to experimental science. As mentioned, the mathematical theory is super hard; McMullen won a Fields Medal for it. Work by Sullivan and Douady is awesome, but they aren't claiming to connect it to experiments.

4. Catastrophe theory. This was the 60s and 70s version of wavelets. Hardly mentioned in the media anymore, and mostly the people who work on it are pure mathematicians.

5. Artificial intelligence. Goedel, Escher, Bach had our hopes up. But nothing ever happened. It's too hard. People claim breakthroughs all the time, but where's the beef?

6. Computer vision. A total mess. They don't even read each other's papers and are busy reproducing each other's work, which tends to be hacks that work only in limited conditions. Remember the MIT face recognition program that was at the Statue of Liberty after 9/11? It failed!
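For what it's worth, the "I just use the Haar basis" remark in point 2 is easy to illustrate. A minimal sketch (my toy example, one transform level only, nothing like how JPEG2000 actually codes images): a step edge, which a Fourier basis would spread over many coefficients, costs exactly one nonzero Haar detail coefficient.

```python
import math

def haar_level(signal):
    """One level of the orthonormal Haar transform (even-length input)."""
    s = math.sqrt(2.0)
    averages = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    details = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    return averages, details

def haar_level_inverse(averages, details):
    """Undo one Haar level exactly."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(averages, details):
        out.extend(((a + d) / s, (a - d) / s))
    return out

# A step edge: broadband for Fourier, but only the sample pair straddling
# the jump produces a nonzero Haar detail coefficient.
signal = [1.0] * 7 + [5.0] * 9
avg, det = haar_level(signal)
nonzero_details = sum(1 for d in det if abs(d) > 1e-12)
rec = haar_level_inverse(avg, det)        # perfect reconstruction
print(nonzero_details)                    # prints 1
```

That sparsity (few significant coefficients for piecewise-smooth signals) is the whole argument for wavelet compression, whatever one thinks of JPEG2000's other machinery.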

Underrated (1)

bunratty (545641) | more than 6 years ago | (#23133908)

1. It was the academic robotics teams that did well in the 2005 DARPA Grand Challenge (the top finishers were from Stanford and Carnegie Mellon). Note that this also incorporated your #6, computer vision.

5. AI continues to improve at a steady pace. It's not that "nothing ever happened". For example, the DARPA Grand Challenge was won and Kasparov was beaten by Deep Blue. I think you might be referring to "strong AI" which we won't get until about 2030 because computers simply won't be fast enough until then.

I think if anything, people aren't realizing how quickly technology is advancing. This is the nature of exponential growth. I'm sure people will be dismissing AI and robotic research right up to when everything changes, seemingly out of the blue to most people.

Re:Underrated (0)

Anonymous Coward | more than 6 years ago | (#23134016)

Have you worked in those areas? I did for many years. It was depressing.

Re:Underrated (1, Offtopic)

Metasquares (555685) | more than 6 years ago | (#23134370)

What you are saying is that we won't get AI until 2030 because computers won't be fast enough to brute-force it until then (and that's only if you believe Kurzweil). We could do it earlier if we came up with some clever ideas, but I don't have any doubt that the first AI is going to be an exact neuron-for-neuron reconstruction of the brain.

Re:Underrated (2, Insightful)

bunratty (545641) | more than 6 years ago | (#23134664)

I know Kurzweil makes similar claims, but I am not merely regurgitating his ideas. The computational power of the brain is massive. In order to have similar computational power in a computer, we will need to wait for many generations of improvements in processors. We will almost certainly need to move away from very powerful cores based on semiconductors timed by a clock towards smaller asynchronous processors. Even with those advancements, there's no way we'll be able to simulate a human brain in real time by 2030. I think the first strong AI will resemble a human brain about as much as the Wright flyer resembled a bird.

Re:Underrated (0)

Anonymous Coward | more than 6 years ago | (#23136068)

Learn your facts before you speak. He said that we WILL BE ABLE TO simulate the brain by 2040/50, and that not only will there be an equivalent level of computing, but a more powerful one sooner or later. Plus, take into account that sapiens came from billions of years of evolution; computers have been evolving for a few hundred, and they're already more complex than the bacteria that took millions to evolve.

Re:Underrated (1)

bunratty (545641) | more than 6 years ago | (#23136746)

Yes, we will be able to simulate the human brain by 2040 to 2050. We will not be able to do so by 2030, as I said. We will have a computer equivalent in computational power to a human brain by around 2030. The 10-20 year lag is due to the overhead of emulating hardware that is quite different from a computer. That's why I say the first strong AI will not simulate a human brain, but rather do what a human brain does in different, more computer-like ways.

Re:Underrated (1)

DeadDecoy (877617) | more than 6 years ago | (#23136092)

I don't think it's the brain's computational power that's slowing us down so much as a lack of understanding of how the brain works. I doubt that the full chemical pathways that occur when someone experiences something (and has that information stored and processed) have been described anywhere. Sure, we know what some of the general components are for long/short term memory and cognitive thinking, but we're still a long way off from understanding how that stuff works. It would be like trying to understand how everything in a computer works without having the theory or prior experience to back it up. Sure, that thing looks like a hard drive and stores data, but how does it do that (assuming you've never seen a hard drive before)? In fact, I'm sure that if we could characterize human brain patterns, there would probably be hundreds of grant requests for supercomputers, or a lot of parallel processors, to do the work.

Re:Underrated (1)

bunratty (545641) | more than 6 years ago | (#23136480)

No, it is the computational power. Last year, one-half of a mouse brain was simulated at one-tenth real-time [livescience.com]. Can you imagine how slowly a full human brain would run in 2008? It would take years just to see what happens upon its getting basic sensory stimulus, far too slow to get useful research out of it.

Re:Overrated (1)

SkyDude (919251) | more than 6 years ago | (#23133916)

4. Catastrophe theory. This was the 60s and 70s version of wavelets. Hardly mentioned in the media anymore, and mostly the people who work on it are pure mathematicians.


This would explain the onslaught of mega-disaster movies in the 70s.

5. Artificial intelligence. Goedel, Escher, Bach had our hopes up. But nothing ever happened. It's too hard. People claim breakthroughs all the time, but where's the beef?

This may be miles above my head, but how could AI ever exist? OK, I can be sold on the idea that there's a clever algorithm at work in AI, but, just like a human mind, choices would be limited to what the person has learned about. Doesn't that limitation apply to a computer's programming? If so, where's the intelligence?

Re:Overrated (0)

Anonymous Coward | more than 6 years ago | (#23134026)

This may be miles above my head, but how could AI ever exist? OK, I can be sold on the idea that there's a clever algorithm at work in AI, but, just like a human mind, choices would be limited to what the person has learned about. Doesn't that limitation apply to a computer's programming? If so, where's the intelligence?
So you disagree that the mind is a physical system?

Re:Overrated (1)

Unfocused (723787) | more than 6 years ago | (#23136950)

Some would argue that the minds of certain others aren't intelligent systems.

Re:Overrated (4, Insightful)

Xest (935314) | more than 6 years ago | (#23134784)

The first step in producing an intelligent system is creating something that can constantly take in inputs and react to them in some intelligent way.

People overlook how important Turing's early successes in producing the first computers were toward this goal. The fact that he was able to create a machine that could continuously take in inputs and respond to them far more dynamically than mechanical systems was a good first step; the fact that we even have computers at all puts a major hurdle out of the way in producing intelligent systems.

AI suffers in that the more we understand about intelligence the less we actually attribute to intelligence. Intelligence is too often treated as some mystical thing that is unexplainable and just is, but the fact is at the end of the day it does just come down to sets of processes and knowledge - albeit extremely complex ones! The problem is in how do we produce something capable of performing processes on par with a human brain when the human brain is a massively powerful system that we just don't have the technology to create artificially on that scale yet.

Of course there's also the question of defining intelligence in the first place; different people explain intelligence in different ways. Many people even redefine their understanding of intelligence many times in a single day. Person X may decide one minute that someone is stupid and unintelligent because they failed a simple English exam, yet decide the next that their dog is intelligent because it lifted its paw on command. It is these moving goalposts that explain why intelligence often gets treated with such contempt: we might create a robot that could equally lift a robotic paw on being given the same voice command as a real dog, yet when the robot does it, it's no longer classed as intelligent. It's hard if not impossible right now to create a system capable of passing the English exam, but you can guarantee that as soon as we could, it would no longer be seen as an intelligent task, by the very fact that it had been handed to a machine to perform.

AI isn't impossible by any measure; we just have to have realistic expectations for it and recognise when we've created a machine that has actually performed an intelligent task. There is never going to be a mystical machine that's seen as an amazing AI robot because it can walk like us, talk like us and act like us, simply because by the time we are able to produce such a machine we will understand it well enough that the mysticism is gone and it's just another machine performing some task that we now understand how to build machines to perform.

Re:Over-stated (1)

noshellswill (598066) | more than 6 years ago | (#23136816)

Byteboyz fool. Computer = mechanical system. Electrons and holes. Volts and amps. Watts. What part of the word "mechanism" do you not understand? Algorithm, program, data, information ... these exist SOLELY in the mind(s) of the (human) coders & builders.

Re:Overrated (1)

Nefarious Wheel (628136) | more than 6 years ago | (#23137562)

It's hard to quantify awareness, and until we can do that we have nothing to measure, and therefore no science. Best to stick with applications, at least that way we'll make steady advancement and perhaps approach the goal.

Remember it's mostly a matter of interpretation, of semantics. To ask whether a machine can think is like asking whether a submarine can swim.

Re:Overrated (2, Informative)

protobion (870000) | more than 6 years ago | (#23134120)

[Quoting the parent's full list of "largely vaporware" fields: robotics, wavelets, chaos, catastrophe theory, artificial intelligence, computer vision.]
And I trust you speak of this from your expertise in every one of those fields?

I'll talk about only what I know. In biology, from ecosystems to cellular processes, one sees a lot of non-linear dynamics, much of it appearing to conform to chaos theory. Chaos theory is believed to be the closest thing we have to an explanation of those phenomena. And yes, it is challenging.

the thing about mathematical theories (1)

rootpassbird (1276000) | more than 6 years ago | (#23134520)

and constructs is that you don't think they are useful for decades or even centuries, and then, one fine day, the next big thing of the day is based on one of these "languishing, useless" constructs, combined with something else that was recently (at the time) discovered or invented.
These theories _last_.
They wait patiently for society to evolve to the point of using them in practice.
Disclaimer 1: IANAM (I am not a mathematician).
Disclaimer 2: does not apply to all works.

Huh? (1, Offtopic)

Xest (935314) | more than 6 years ago | (#23134660)

"5. Artificial intelligence. Goedel, Escher, Bach had our hopes up. But nothing ever happened. It's too hard. People claim breakthroughs all the time, but where's the beef?"



You sound like a hook, line, and sinker victim of the AI effect [wikipedia.org]. I'm no expert in the other areas you mention, but in terms of AI you truly don't seem to understand the subject at all. There are plenty of fields where AI has been extremely successful and is used on a daily basis: data mining, medical diagnosis, and spam filtering, to name a few. I can only guess that you're living under some false expectation of strong AI anytime soon; whilst the possibilities of AI were overhyped for a long time, it has now been accepted for nearly as long that we simply don't have the understanding or the computing power to produce strong AI just yet.

Re:Huh? (0)

Anonymous Coward | more than 6 years ago | (#23137636)

data mining, medical diagnosis, spam filtering
Well, one could argue that these are basically more or less novel statistical methods.
Have they taught us any insights into the phenomenon of intelligence?

Re:Huh? (0)

Anonymous Coward | more than 6 years ago | (#23138294)

More like "The AI effect is a construction of desperate AI people".

Re:Overrated (1)

genmax (990012) | more than 6 years ago | (#23134840)

Wavelets: yes, it is "just" a choice of basis, but a good one for images, because it admits a sparse representation of image features. Sharp edges, which are essentially very high frequency and would take a lot of significant Fourier coefficients to reconstruct, are explained much more compactly in the wavelet basis. Sparse representation implies better compression and higher signal-to-noise ratios for denoising.

I'm sorry, but your high-handed dismissal of all of these research areas (computer vision: heard of Microsoft's Photosynth, Honda Labs' recent work toward automated pedestrian detection, image search algorithms at the University of Cambridge, etc.?) with such sweeping one-line critiques is quite obnoxious. So yes, you were right: people are going to be pissed.

Re:Overrated (0)

Anonymous Coward | more than 6 years ago | (#23135026)

Ah yes, you sound like one of the brainwashed wavelets people. It's like having a conversation with a creationist. You can't convince me that wavelets are good because they SOUND like a good idea. Can you answer the questions raised above about JPEG2000? Can you point to hard numbers and comparisons of wavelets with other methods? The burden is on you, as an advocate of wavelets, since there don't seem to be any compelling examples that are so much better than other methods.

Re:Overrated (2, Insightful)

genmax (990012) | more than 6 years ago | (#23135926)

Err ... no; that's what all the wavelet-based compression and denoising papers _are_ about: comparisons in terms of the number of significant wavelet coefficients required, compression ratios, mean square error in denoised images, etc. There is a reason the JPEG committee put wavelets into the JPEG2000 standard.

There were no 'questions raised' about JPEG2000, only the poster's impression that wavelets were not a good idea. Wavelets and their use in image processing, on the other hand, have gone through a very thorough peer-review process, and the comparisons are there for everyone to see in these journals.

And nice job with the comparison to creationists; may I point out that it is you (and the original poster) who are challenging wisdom accepted in the science and research communities, accepted after a thorough peer-review process that is, without any tangible arguments.

Re:Overrated (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23134932)

By using the word 'vaporware' you are indicating that you fail to understand basic research. Basic research is about knowledge, not products; it serves as a basis for many present-day applications, or forms paths to new knowledge. For basic research to work, the researcher needs a goal, a focal point that might become researchable in the far future, to work toward. This 'dream' keeps them focused and interested in an otherwise boring topic.

I don't think many people saw the use of Gauss's stuff back then either, but try to imagine a world without all that mathematical knowledge.

Re:Overrated (0)

Anonymous Coward | more than 6 years ago | (#23136094)

The investigation into chaos (if you read Lorenz's book you'd know) looks at the overall behavior of non-linear dynamical systems, which are characterized most prominently by sensitivity to initial conditions. You may find it a trifle now to enter some equations into your computer and let it run to show you the results, but in the era Lorenz started in, this was entirely new. People had no way of estimating how fluid turbulence would occur other than "unless you model each point/atom, your model is inaccurate." What is really amazing is how much these people brought chaos theory to public attention by pointing to your simple pendulum and noting how it is, in fact, chaotic under driving and damping forces. These are the baby steps needed before we are able to more fully understand complicated systems, and to scoff at them as if they don't apply is to laugh at the foundation for not doing much besides supporting the building.

Re:Overrated (1)

Chutulu (982382) | more than 6 years ago | (#23136954)

Wavelets have lots of applications in segmentation methods, as do Markov fields, for example; they are good at finding differences in image textures. Computer vision is still a field that needs a lot of development. It's very difficult to find proper methods to solve a problem, and yes, when something works it often does so only under very special conditions. The same can be said about speech and text recognition. But don't you think it is worth all the work people have put in? By making small steps we can reach the goal we all desire, no?

Re:Overrated (1)

fm6 (162816) | more than 6 years ago | (#23137332)

Basically, you're complaining that all the ideas you mention are overhyped and are dominated by inept hacks.

You're absolutely right. But so what? It doesn't detract from the achievements of the brilliant people who founded these fields. Let's stop and appreciate Edward Lorenz's achievements, and save Fixing Academic Science for another day.

Re:Overrated (0)

Anonymous Coward | more than 6 years ago | (#23137370)

0. Grid Computing

How can you miss that? In addition to being vaporware, it has earned the notorious distinction of consuming millions and millions of research funding $$s, and all it has produced is a slide full of icons, cool keywords, and acronyms.

Re:Overrated (2, Informative)

jmichaelg (148257) | more than 6 years ago | (#23134688)

it is controversial that chaos will ever contribute to science in any way.


I agree that a lot of chaos work produced not much more than chaos. But sometimes a paper can tell you what results to discard out of hand, and that in itself is a contribution. From his seminal 1963 paper [allenpress.com]:

When our results concerning the instability of nonperiodic flow are applied to the atmosphere, which is ostensibly nonperiodic, they indicate that prediction of the sufficiently distant future is impossible by any method, unless the present conditions are known exactly.

Not "unless" but "even if" (1)

crovira (10242) | more than 6 years ago | (#23136148)

Chaos is a process of descending down paths of probabilities, not a result.

Quantum theory and the indeterminacy of states mean that the actual state of a system has to be known; it cannot be calculated.

Re:Overrated (1)

Schemat1c (464768) | more than 6 years ago | (#23135130)

Its controversial that he was the first. A lot of people worked on this area. In fact, it is controversial that chaos will ever contribute to science in any way.
Yes, the whole history of this idea is very... chaotic?

*ducks*

Re:Overrated (0)

Anonymous Coward | more than 6 years ago | (#23136014)

You are an idiot. As a graduate student in physics with a bachelor's in math/physics: we are using the theory both in physics and in math, not to mention that chaos theory is now used in cognitive neuroscience as well, in the field of artificial neural networks.
But I do have to say that his research was nowhere near as important or vast as Newton's. Compared to Isaac he was a nobody.

Re:Overrated (2, Interesting)

DynaSoar (714234) | more than 6 years ago | (#23136592)

> ... it is controversial that chaos will ever
> contribute to science in any way. ... there are
> a lot of bad papers purporting to use chaos
> theory.

From my own field:

Supporting your assertion: one characterization of chaos is measuring the fractional (i.e. 'fractal') dimensionality of the phenomenon. Someone estimated the dimension of the human cortex, with its convolutions embedded within convolutions. They plugged in numbers and got a result. But what's the point? What good does it do? There's no theory or even speculation as to what this might mean. It's doing 'sexy' but irrelevant science.
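A toy version of that fractional-dimensionality measurement, for readers who haven't seen one (box counting on the middle-thirds Cantor set, where the answer ln 2 / ln 3 ≈ 0.631 is known exactly; estimating the dimension of cortex or EEG data is far messier than this):

```python
import math

def cantor_intervals(depth):
    """Intervals left after `depth` rounds of removing middle thirds."""
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        intervals = [piece
                     for a, b in intervals
                     for piece in ((a, a + (b - a) / 3.0),
                                   (b - (b - a) / 3.0, b))]
    return intervals

depth = 8
boxes = len(cantor_intervals(depth))       # boxes of size 3**-depth that are hit
dim = math.log(boxes) / math.log(3 ** depth)
print(dim)                                 # 0.6309..., i.e. ln 2 / ln 3
```

A non-integer slope of log(box count) against log(1/box size) is what the fractional dimension reports; the question in the cortex study is what, if anything, that number explains.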

Contrary to your assertion -- EEG shows signs of being chaotic (self-similarity at different scales, dependence on initial conditions). Characterizing the dynamics (changing dimensionality) of the bioelectric field at different scales was compared with the same measure of synchronous firing of neural assemblies. The result significantly supported the hypothesis that EEG at different scales represented changes in neural synchrony/desynchrony at different scales. The synchrony concept is trivial, and well supported by electrophysiological measurement. This application showed that characterizing the signal as chaotic (and so too the deterministic phenomenon producing it) had some validity. The utility of the study applies to use of transcranial magnetic stimulation to force widespread synchronous firing, reducing the dimensionality, and forcing a state also having low D, that of people with depression taking antidepressants and showing benefit from them. It supports the hypothesis that TMS can treat depression. And it now does.

With respect to my field, you're mostly correct in your assertion. Most applications are quite obviously being done by people with little understanding of the concepts, and/or applied to things with little regard as to whether it makes sense to do so.
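For readers wondering what "measuring fractional dimensionality" actually involves: the usual estimator is box counting. A minimal hand-rolled sketch (an illustration, not code from any of the studies above): cover the point set with grids of shrinking cell size eps, count occupied cells N(eps), and read the dimension off the slope of the log-log fit.

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate the fractal (box-counting) dimension of a 2-D point
    cloud: count occupied grid cells N(eps) at each scale eps, then
    fit log N(eps) against log eps; the slope of the fit is -D."""
    counts = []
    for eps in epsilons:
        # Assign each point to a grid cell of side eps; count distinct cells
        cells = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(epsilons), np.log(counts), 1)
    return -slope

# Sanity check on a known case: points along a line segment have D = 1
t = np.linspace(0.0, 1.0, 100000)
line = np.column_stack([t, t])
d = box_counting_dimension(line, [0.1, 0.05, 0.02, 0.01, 0.005])
print(round(d, 2))  # close to 1.0
```

The same estimator applied to a strange attractor's points yields a non-integer D, which is the quantity the EEG and cortex studies above were chasing.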

> It's controversial that [Lorenz] was the first.

Quite so. He did, however, investigate the nature of what he was seeing in a very creditable way. After noting the sensitivity to initial conditions, he reran one of his simulations from the middle, expecting it to finish out the run producing the same data. It didn't. And he took that to be a significant (in the practical sense) result, despite it being contrary to his expectations. Many researchers would bottom-drawer a dataset that contradicted their hypotheses. He recognized the probability that it would prove important. That takes some courage as a researcher, and he deserves credit for that at least.

Re:Died of cancer... but why? (2, Funny)

CastrTroy (595695) | more than 6 years ago | (#23134004)

Re:Died of cancer... but why? (1)

anilg (961244) | more than 6 years ago | (#23138240)

Hehe.. yeah. Too bad he didn't patent his theory, he could have collected large royalties from MS over Vista :)

Re:Died of cancer... but why? (3, Funny)

Digestromath (1190577) | more than 6 years ago | (#23133610)

I think the cancer angle is all wrong for a headline.

The sensationalist news should report "Butterfly goes on rampage, slays notable mathematician. Police seek public's help in finding strange attractor."

Or is it really just too soon for jokes? How do we as a community honour someone who has made such great contributions? Contributions no doubt to be misremembered by pop culture (thanks a lot, Hollywood).

Perhaps he would be fondly remembered through an internet meme.

Re:Died of cancer... but why? (1)

QuantumG (50515) | more than 6 years ago | (#23133648)

was he that guy in Jurassic Park?

Re:Died of cancer... but why? (4, Funny)

JustOK (667959) | more than 6 years ago | (#23133664)

I'm thinking that there probably shouldn't be a particular pattern to when the jokes are made.

Re:Died of cancer... but why? (2, Interesting)

witherstaff (713820) | more than 6 years ago | (#23133674)

As humor is often just a funny way of being serious, I don't think it's too soon for jokes.

Perhaps we should all use one of the many chaos generators out there for a few minutes as a way to salute his work? What is the best chaos program out there nowadays - I haven't used one since fractint ages ago. Well I did install the electric sheep screen saver [electricsheep.org] which is very nice eye candy, but I want to see some particles circling around a Lorenz attractor.

Re:Died of cancer... but why? (2, Funny)

JustOK (667959) | more than 6 years ago | (#23133960)

I've been using Vista. That's been pretty chaotic.

Re:Died of cancer... but why? (0)

Anonymous Coward | more than 6 years ago | (#23133874)

Not as relevant as how upset people are by this... there's chaos in the streets I tell you!

Furthermore... (1)

thegnu (557446) | more than 6 years ago | (#23137064)

...if he doesn't die next year at the same time, we can finally be sure he was on to something. :)

A hugely important concept... (4, Interesting)

26199 (577806) | more than 6 years ago | (#23133582)

...and also one that's fun to play with [exploratorium.edu] (needs java).

Re:A hugely important concept... (1)

Tastecicles (1153671) | more than 6 years ago | (#23134360)

I'll say. [triumf.ca]
In case you missed it, I'll say it again. [wikipedia.org]
There are several programs out there, most are freeware, a few are open source, but the one that sticks out in my mind and easily the best and most powerful of the bunch is FractINT. For the past fifteen years I've been playing with this gem, and I'm still finding new stuff I can do on it. Some of it isn't even covered in the 580-page manual.

Re:A hugely important concept... (1)

kramulous (977841) | more than 6 years ago | (#23137118)

Woohoo! When I click on the canvas I only get a straight line (vertical, drawing towards the top). What's so special about that?

Yes, I'll go now.

Twofo Live! (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23133606)

Twofo Live! [twofo.co.uk]

Late, late, late .... (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23133650)

How come this news gets on ./ so late ??

Re:Late, late, late .... (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23133708)

because you didn't submit it earlier you buttwipe!

now go back to digg or reddit or whatever

Studied it for over 15 years (1)

nighty5 (615965) | more than 6 years ago | (#23133720)

I don't like toot my own horn but I've studied chaos theory and made some significant findings over the years.

My best work has been realised over a night of heavy intoxication, especially between 18 and 23 years of age. This work requires a lot of effort and is usually conducted on Friday and Saturday nights. I can't believe just how many gifted mathematicians there are over these nights. So much research, so many beers.

However, these days I'm a bit more relaxed and allow the younger crowds to take over.

Re:Studied it for over 15 years (1)

smallfries (601545) | more than 6 years ago | (#23133768)

I believe that I may have worked with some of your collaborators.

Re:Studied it for over 15 years (1)

ericvids (227598) | more than 6 years ago | (#23135550)

> I don't like toot my own horn

I don't like toot either, but he's YOUR horn so I couldn't care less. /chaos ensues

(It's funny. Laugh.)

He got his butterflies wrong (1)

Hasmanean (814562) | more than 6 years ago | (#23133724)

He got his geography wrong. Butterflies in Brazil do not lead to hurricanes in Texas. As you can see on this [wikipedia.org] graphic on this page [wikipedia.org] , there is practically no hurricane activity in the South Atlantic.

Most hurricanes that would hit Texas all originate as storms over West Africa.

I wonder why Lorenz didn't use Africa in the title of his paper instead of Brazil.

Re:He got his butterflies wrong (1, Funny)

Anonymous Coward | more than 6 years ago | (#23133782)

whoosh... that's the sound of a butterfly flying completely over your head...

Why Lorenz's work was important (5, Interesting)

wickerprints (1094741) | more than 6 years ago | (#23133758)

Back in my college days, I visited the library and looked up Lorenz's paper, "Deterministic Nonperiodic Flow." On the face of it, the presentation was not particularly striking, nor did it seem significant on a superficial reading. That it was buried in a meteorology journal, rather than a mathematics or physics journal, only further obscured its importance.

Lorenz's discovery was not so much about the specific nonlinear differential system (now named after him) that he discussed in the paper, nor was it about chaos theory as we now know it. The significance lay entirely in the notion that even simple dynamical systems can display sensitive dependence on initial conditions, and that when extrapolated to real-world phenomena, the intrinsic complexity of their behavior was all but inevitable.

A chaotic system is not merely disordered, or random. There is an underlying structure. Call it a kind of orderly disorder. Prior to (and indeed, for some time after) Lorenz's work, physicists largely dismissed this possibility as absurd. We can, in such a system, model its state at some infinitesimal time t+dt after some given state at time t. We can do this quite accurately. But as Lorenz showed, the deterministic property is insufficient to imply that one can know the state of the system at any arbitrary time in the future. There is a difference between knowing how the future is calculated from the past, versus knowing what the future will actually be.

Hence the chosen title. "Deterministic" = future states are well-defined from a known prior state. "Nonperiodic" = does not display cyclical behavior. "Flow" = fluid dynamics, in Lorenz's case, atmospheric convection.

He is truly missed.
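That distinction - knowing the update rule without knowing the distant future - is easy to see numerically. A minimal sketch using the Lorenz system with its standard parameters (sigma=10, rho=28, beta=8/3); the Euler step size, initial point, and perturbation here are illustrative choices, not anything from the paper:

```python
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # Lorenz's standard parameters

def lorenz_step(state, dt=0.001):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return state + dt * np.array([
        SIGMA * (y - x),
        x * (RHO - z) - y,
        x * y - BETA * z,
    ])

def run(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])  # differ by one part in a billion

near = np.linalg.norm(run(a, 1000) - run(b, 1000))   # after t = 1
far = np.linalg.norm(run(a, 40000) - run(b, 40000))  # after t = 40
print(near)  # still tiny: the two runs look identical at first
print(far)   # comparable to the attractor's size: the runs have decorrelated
```

The rule for t+dt is exact and deterministic at every step, yet the billionth-part error compounds until the two futures bear no resemblance to each other.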

"Flow" - not from fluid dynamics (4, Informative)

S3D (745318) | more than 6 years ago | (#23134030)

Flow in the title of Lorenz's paper is not the flow of fluid dynamics or physics. It's a purely mathematical term meaning a solution of a differential equation (the Lorenz equation in this case). In the more general sense, a flow is a group action of R on a manifold [wikipedia.org] - that is, a solution of a differential equation on a curved surface. It's studied by a specific branch of mathematics, differential (topological) dynamics [wikipedia.org] , which in large part owes its origin to the Lorenz paper. So the title of the paper really means "Deterministic Nonperiodic Solutions".

Re:Why Lorenz's work was important (1)

jlcooke (50413) | more than 6 years ago | (#23134882)

Poincaré knew of such systems when he worked on the n-body problem. Thing was - he died before Lorenz was ever in school. So Lorenz is given the title of "father of chaos" when it's not appropriate to do so, imho.

RIP (0)

Anonymous Coward | more than 6 years ago | (#23133790)

I guess entropy got the best of him...

wya to rock thaty old news (0)

Anonymous Coward | more than 6 years ago | (#23133884)

WAY TO ROCK THAT OLD NEWS!

Ray Bradbury story (1952) (4, Interesting)

Fallen Andy (795676) | more than 6 years ago | (#23133888)

Time travelling hunter strays off path and kills a butterfly ... A Sound of Thunder [wikipedia.org]

Andy

Chaos theory in 3 words (0)

Anonymous Coward | more than 6 years ago | (#23133892)

Errors add up.

The butterfly effect is way too slow... (1)

howardd21 (1001567) | more than 6 years ago | (#23133906)

Evidently dying on Weds does not result in a Slashdot post until Sunday? That is one slow effect...

Or did he? (1)

Snaller (147050) | more than 6 years ago | (#23133924)

It's a leaf in the forest.

Requiescat in pace (0)

Anonymous Coward | more than 6 years ago | (#23134104)

Rest in peace, Edward Lorenz. The world will be a little more chaotic without you. Or maybe less. I can't predict which.

Now I can say (1)

kampangptlk (1252914) | more than 6 years ago | (#23134188)

his death will cause the universe to explode. wow.

Father of chaos theory? (1)

migloo (671559) | more than 6 years ago | (#23134214)

I do not want to belittle Lorenz' major contribution to chaos theory, but the concept (if not the word) of chaos had long been fairly well grasped by Poincaré (1890) and Hadamard (1898).

Re:Father of chaos theory? (1, Informative)

Anonymous Coward | more than 6 years ago | (#23134544)

No, they discovered chaotic systems but did not identify chaos theory. Their discoveries are analogous to Kepler's equations of planetary motion versus Newton's formal theory of gravity. Or Faraday's studies of the electric and magnetic fields versus Maxwell's formal electrodynamic theory. Hadamard, Lyapunov, and Poincaré made great contributions, but they did not found chaos theory. They observed individual problems but failed to piece it together in the more general mathematical theory. I don't fault any of them. Nonlinear dynamics was only in its infancy during their lives.

Re:Father of chaos theory? (0)

Anonymous Coward | more than 6 years ago | (#23135824)


Ed Lorenz's insight was not mathematics, but physical understanding and explanation.

There is plenty of obscure mathematics irrelevant to the real world, and besides, Lorenz surely didn't know about the exotica of Poincaré either. And some of Poincaré's work was regarded, in its own day, as insufficiently rigorous folderol, outside mainstream mathematics.

The point was not that chaos was a pathological mathematics object, but that it was real and common and critical in real world physics and models used by physics, and that common explanations (at the time) for complexity---pictures of broad band mode excitation in a linearish model---were not necessarily right.

Chaos theory father dies of cancer (1)

ChrisMaple (607946) | more than 6 years ago | (#23134292)

Doesn't anybody see the irony here?

Re:Chaos theory father dies of cancer (1)

crossmr (957846) | more than 6 years ago | (#23135138)

er.. no?

You might be thinking of coincidence... or really something else entirely... lots of people die of lots of things..

An experiment (4, Informative)

Coppit (2441) | more than 6 years ago | (#23134332)

Here's a simple chaos experiment you can do at home... Turn on a faucet slightly so that it drips regularly. Then increase the flow slightly, and pretty soon the drips will come out in a non-regular way. Understanding the transition from regular to irregular is part of what chaos theory is about.
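The dripping faucet follows the same period-doubling route to chaos as the textbook logistic map. A rough sketch (the map stands in for the faucet here; it is not a physical model of dripping):

```python
def attractor_cycle(r, warmup=2000, sample=64):
    """Iterate the logistic map x -> r*x*(1-x), discard the transient,
    then collect the distinct values (to 6 decimals) the orbit visits."""
    x = 0.5
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

# The parameter r plays the role of the faucet's flow rate:
print(len(attractor_cycle(2.8)))  # 1  - steady, regular dripping
print(len(attractor_cycle(3.2)))  # 2  - drips alternate between two intervals
print(len(attractor_cycle(3.5)))  # 4  - period four
print(len(attractor_cycle(3.9)))  # many - chaotic, non-repeating dripping
```

Opening the tap further (raising r) doubles the period again and again, faster and faster, until the drip timing never repeats - the transition the parent comment describes.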

Re:An experiment (0)

Anonymous Coward | more than 6 years ago | (#23134968)

Keep the process on until the water overflows the sink and makes a chaos on your carpet.

But but but! (0)

Anonymous Coward | more than 6 years ago | (#23134750)

But I invented the Chaos Theory, if you don't believe me come have a look at my room!

(Captcha: Nonsense)

Fractint (2, Informative)

Alarindris (1253418) | more than 6 years ago | (#23134898)

I saw a program on the Mandelbrot set and Lorenz attractor on PBS in the late 80's. Completely changed the way I thought about the world. Also where I discovered Fractint.

http://www.fractint.org/ [fractint.org]

Hmm (1)

rakzor (1198165) | more than 6 years ago | (#23135032)

Famous scientists in their 90s are dying lately.

Contribution to relativistic physics and limericks (1)

mpoulton (689851) | more than 6 years ago | (#23135200)

There once was a man named Fisk
whose thrust of the sword was so brisk
that with the speed of the action
the Lorentz-Fitzgerald Contraction
reduced his rapier to a disk!

oh? (1)

thedrx (1139811) | more than 6 years ago | (#23135208)

What is your Erdős-Bacon number?

his death ... (1)

FIT_Entry1 (468985) | more than 6 years ago | (#23135538)

... was ruled to be completely random.

So, farewell, Ed Lorenz (1)

E.J.Thribb (910683) | more than 6 years ago | (#23136072)

So.

Farewell then, Ed Lorenz.

You have
in a way
reached a steady state,
that is to say
a point attractor
with cycle zero.

Now you can say hello to Schrödinger.
And stroke his cat.
If it's dead, that is.

Obligatory Addition to your Gleick Bookshelf (2, Interesting)

iluvcapra (782887) | more than 6 years ago | (#23136124)

Chaos: The Making of a New Science [amazon.com] . Tells the entire story of Lorenz's discovery, in gory detail, down to the fact that he used a Royal McBee computer to do his original weather simulation, the same computer in the famous hacker "Story of Mel".

He discovered the Lorenz Attractor? (1)

Rui del-Negro (531098) | more than 6 years ago | (#23136182)

Whoa, talk about a coincidence.

A butterfly, or a thousand 777s? (1)

starglider29a (719559) | more than 6 years ago | (#23136500)

If a butterfly can cause a hurricane, what can an entire air corridor of passenger liners do?
Did we notice a change in the weather in the days after 9/11 when the planes were grounded?
Have the last few decades of jet air travel caused the weather system to adapt such that reducing the number of flights (like 800 jets grounded for safety inspections) has a greater effect than leaving them flying?
Could the rapid swings in weather, (higher highs/lower lows) be caused by the aircraft Giga-Butterfly Effect (aka the Mothra Effect) more than a climate warming effect?
What effect would a nuclear detonation have?
How much effect does war in Iraq have?

Maybe the Chinese should ask for a moratorium on war in Iraq during the Olympics... or whatever the Butterfly Interval is.

How disappointing... (1)

clichescreenname (1220316) | more than 6 years ago | (#23136658)

I mean, seriously... cancer? At age 90? How predictable...

I just wish he could have had a "Final Destination" style death... [youtube.com]

in games (1)

rastoboy29 (807168) | more than 6 years ago | (#23139202)

I got a first-class example of this in my game, where I have dumb bots following simple paths with nodes. That's all they do, follow paths. But in the engine they are subject to the vagaries of the physics and sim system. I have long noticed how even the slightest change in initial orientation or inertia can have wildly significant effects on the precise paths the bots take, or whether they can complete them at all. The system, in fact, is sufficiently complex that even background processes on the game server can affect the timing of some moves, and thus the paths.

Needless to say, debugging could be unpleasant.