Two parts to "intelligence" by human standards
There are really two things to consider in producing another creature like humans, one that visibly demonstrates its intelligence by manipulating its environment to suit it and asserting some mastery (domestication) over lesser animals.
One, of course, is actually having the brains and a complex means of communicating ideas beyond "bad", "good", "I like that" and "I fear that". The other is the means of manipulating the environment, and tools. And we still don't clearly understand how this happened to us, so we can't properly predict how another spinoff creature might reach the same point. We see signs of growing tool use in other primates now, but much of that is a result of them watching us do it ("monkey see, monkey do"), and as such it can't be relied on to make the kind of predictions the show was aiming for.
Certainly ant and bee colonies are a clear demonstration of how an individual in a group can be an idiot and yet through signals sent by messenger individuals (utterly unaware of what they're carrying) the colony can show massive awareness and adaptation to situations in its environment. In other words, the colony shows signs of consciousness and individuality even as the individual insect shows none. The special decided to look at the extensions of such colonies in jellyfish and spiders.
I think another fault might have been ignoring the fact that our metal and buildings and trash would still be in some parts of the environment...it would have been nifty to see what might have evolved as a result of the situation of an abandoned city falling apart.
"It's not like all animals will grow really big if nothing killed them." -- not all, but some will, and if sticking your neck out gets you food others can't reach, then long necks will likely evolve. The problem is that most creatures today eat grass (no need for a long neck unless the body is too tall, as with elk and moose) or climb up into the trees they eat from. For a turtle to evolve a long neck, it would have had to find some reason not to evolve into a grass-eater, which is nature's current direction since grass is so prevalent (even giraffes can deal with grasses, though they prefer leaves).
"why would sharks suddenly have a need for bioluminescence" -- perhaps a lack of food on the surface of the oceans due to excessive heat or cold? Although some creatures (like cats) have been steadily gaining some degree of infrared heat-sensitivity to deal with the dark...
"remember how raptors hunt in packs and use logic" -- I still consider that speculation at this point...believable, but not enough to base an argument on...particularly not after having based a movie on it
"Also squid have existed for millions of years and have remained pretty much unchanged" -- I would agree with this assessment. Some creatures have definitely shown an end-of-the-road aspect to their evolutionary existence (as in, they're unchanged from pre-dinosaur times), and I don't see nature trying again with the invertebrates as long as there are vertebrates who can eat them.
on the show's prediction of the elimination of mammals:
yes, the big cats are disappearing, mostly due to us hunting them or taking away their own hunting and breeding grounds, but the small cats, particularly the domestic cats, are tremendously well adapted hunters of small insects and rodents, and given that most mass-extinctions tend to take out the larger animals, I can't see the food supply of cats ever disappearing...and small cats aren't hunted by anybody (you don't get much eating a meat-eater, and nature knows that).
Right now, we're the biggest killer of cats (roadkill) and the biggest danger to their extinction (eventually we might get stupid and "fix" the last domestic cat and poof! no more cats).
There's one way they can't stop you, though there's a minor loss in quality (give or take... most heavy-compression audio formats are already lossy, so it really can't get much worse): run it through an analog mode on the way into a computer. Music won't really be "CD-quality", but for spoken word, who's going to notice? And this works fine for me when I'm making (personal-use-only) CDs from DVD and VHS concerts (that I own -- no piracy here, you RIAA jackasses).
Basically, what you do -- requiring either 2 computers, or an intervening high-quality tape recorder, or the Philips CD-recorder thingy -- is wire one computer's "speaker" jack (not the headphone jack, as that's amplified and will distort) into the line input (again, not the amplified "mic" jack) of another computer, play with the volume and input settings 'til you don't see red, then with appropriate software (n-Track Studio works for me on my Win98 laptop @ home) record it into a WAV file.
If you don't have 2 computers handy, you can use a tape recorder with Dolby-B to make a temp copy that you play back into the computer, or a digital tape recorder will do even better -- but again, use the analog i/o lines instead of trying an all-digital solution. Keep in mind that an audio CD holds 650-700 MB (74 or 80 minutes) of audio, so you'll need a lot of free disc space, on a single partition, to make the disc.
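To get a feel for how much free disc space that really means: uncompressed CD audio is 44100 samples per second, stereo, 16 bits per sample, so the raw WAV data runs to roughly 10 MB per minute. A quick back-of-the-envelope sketch (class and method names are mine, just for illustration):

```java
public class CdSpace {
    // Red Book CD audio: 44100 samples/sec * 2 channels * 2 bytes/sample
    static long wavBytes(int minutes) {
        return 44100L * 2 * 2 * 60 * minutes;
    }

    public static void main(String[] args) {
        // A 74-min disc needs ~746 MB of raw WAV; an 80-min disc ~807 MB.
        System.out.println("74 min: " + wavBytes(74) / (1024 * 1024) + " MB");
        System.out.println("80 min: " + wavBytes(80) / (1024 * 1024) + " MB");
    }
}
```

So a full disc's worth of captured audio needs three-quarters of a gig free before you even start cutting.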
Mind you, with any CD already cut, you can copy that independently, so the whole "you can only make one cd out of this" is stupid and useless. But I read and noticed you already noted that flaw in their logic...
It really comes down to this: any one audio package by itself isn't enough to do all the audio you want to do, in order to stay DRM compliant, but the right combination of tools, and the willingness to pass through an analog mode on the way, will allow you to do anything.
Large Animals, Little Food, happened before...
The indricothere, of the rhino family, managed to survive for a cool few million years eating from rare trees in scrub-brush land, dealing with wet and dry seasons much like the current African savannah (only without the grass).
However, there are enough grass-eaters out there that I don't think evolution's going to produce new tree-eaters, except ones that eat fruit as part of an omnivorous diet (like primates today).
Nature's produced grass-eaters (including ourselves, though we can only deal with the seeds, cooked -- wheat and barley are grasses) in quite a few mammal, lizard, and salamander families, and I don't see the grass-eaters giving up, or tree leaves becoming the core of the future diet again. Tree leaves are a luxury item, like giving carrots and/or radishes to a rabbit, but most mammals still need a steady regimen of grains.
A comment I just left in the "Future Is Wild" boards:
Darwin's theory explains why things are the way they are with regard to survival, but it doesn't explain the HOW, which is mutation. Mutations occur, and natural selection drives the duplication of the mutated genes 'til a new species is differentiated from the old.
However, the nature of how mutations really happen, and how "good" ones that get "preferred" arrive (as we're very keyed in to hating anything "different" ourselves, and often shun it in humans or kill it in animals), is what we as humans have not been able to truly see or test. It's hard to test: mammals have too long a breeding period, and colonial insects (ants and bees) are usually dominated by the queen's genes. Most genes that change behaviours tended to have already been on the planet somewhere, and are only spreading now because we're accidentally spreading them (e.g., "africanized"/killer bees).
The show did a good job of suggesting what natural selection might do, given a set of mutations over X million years to produce said animals, but the fact is that the mutations themselves are what's utterly unpredictable...and truth be told, rather boring by comparison to the end-results we saw.
I consider evolution a fact, but not a law in the Newton/Einstein sense, because evolution can't be used to predict the future with any accuracy since evolution doesn't explain mutations; it only relies on them. It would be like trying to use Einstein to predict something in electrons without the use of calculus.
The series seems to imply that organized societal creatures (jellyfish and spiders, like ants today) will be the big survivors and that individually-thinking creatures won't make it, and I feel this desire to limit the effect and adaptability of individual intelligence is misdirected.
Sorry, guys, but that's an easy one. As a result of the new alliance/union of the Gungans and the Naboo, the Naboo offered one of their seats to the Gungans. The Gungans voted for Jar Jar. Unanimously. To get him off the damn planet.
Actually, the language itself isn't the greatest. It won over C++ developers for 3 reasons: a standard network library, closely tied to a (reasonably) well-designed, truly OO I/O library; a standard GUI API that worked across platforms; and considerable simplicity and code-readability over full-effect templates in C++ (read Alexandrescu's Modern C++ Design recently?).
A fourth capability, present in the first released version of Java but not used to its fullest capacity until Java2, was the use of "interface" over abstract base classes as a means of building frameworks in the newer library components such as JDBC, Collections, and XML. C++ always had this, but "interface" is a much simpler syntax and means of verbal expression than "abstract base classes with only pure virtual functions" (the C++ version of a Java interface). Also, newer C++ libraries and designs tend to rely on templates and traits to enforce an interface (a la generic programming) rather than class design, because it's easier to just write a class than to design a hierarchy -- Java's "interface" took the hierarchy out of places it didn't need to be.
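To make that comparison concrete, here's the same contract spelled both ways -- toy names of my own, with the C++ version shown only as a comment:

```java
/* The C++ spelling of the same idea, an abstract base class
 * with only pure virtual functions:
 *
 *   class Shutdownable {
 *   public:
 *       virtual void shutdown() = 0;   // pure virtual
 *       virtual ~Shutdownable() {}
 *   };
 *
 * In Java the contract is a first-class construct:
 */
interface Shutdownable {
    void shutdown();
}

// Implementing the contract costs nothing in hierarchy terms:
// Service can still extend whatever base class it likes.
class Service implements Shutdownable {
    private boolean running = true;

    public void shutdown() { running = false; }

    public boolean isRunning() { return running; }
}
```

The Java version says exactly what it means in two lines, which is precisely why it straightened out so much inexperienced OO code.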
Now, one of those ended up an utter failure (AWT), and its replacement (Swing), though amazingly more successful as far as design and power go, is as noted dog-slow (though it's something that gets faster as machines get faster; Moore's law helps Swing considerably). I personally love Swing just because (when used properly) the WORA DOES work (layout management is the #1 problem for almost every bad interface out there, and that's not unique to Swing; I recall a lot of bad Motif layouts too, from Windows programmers not used to Xt's approach); also, the power of using renderers for complex components, dividing up the responsibility of showing the look vs. managing the data, is something I miss in every other GUI library out there.
The second, the standard networking library tied to the I/O library, remains its brilliant point and the basis for Java's most successful libraries and projects, including all its server-side work. Bjarne is so impressed by the standard socket + stream library that works on ALL platforms (it's the one that's most reliable in that respect of WORA) that he's planning to propose a standard socket interface for C++, though I think it's now too little, too late. I'm not saying that java.net and java.io aren't flawed. The use of abstract base classes for Socket and URLConnection, which likely dates to before the interface keyword was introduced, is a "bad thing"; java.util.Dictionary was like this as well, but at least that's been deprecated out. Similarly, java.nio addresses most of java.io's problems, but at the considerable cost of code simplicity; if you need java.nio, it'll take a lot of time and work to use it correctly, and few books and articles are really making it clear when you actually need it. In the early stages of a new technology, the "how do I use it" question well-buries the "when do I need it" question.
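That socket + stream marriage is easy to show: java.net hands you the connection and java.io wraps it in the same readers and writers you'd use on a file, and the identical code runs on every platform. A minimal loopback echo sketch (class and method names are mine):

```java
import java.io.*;
import java.net.*;

public class EchoDemo {
    // Send one line over a loopback socket; a server thread echoes it
    // back upper-cased. Pure java.net for transport, java.io for framing.
    static String echoOnce(String msg) throws Exception {
        ServerSocket server = new ServerSocket(0); // bind any free port
        Thread t = new Thread(() -> {
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println(in.readLine().toUpperCase());
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        t.start();
        String reply;
        try (Socket client = new Socket("localhost", server.getLocalPort());
             PrintWriter out = new PrintWriter(client.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            out.println(msg);
            reply = in.readLine();
        }
        t.join();
        server.close();
        return reply;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(echoOnce("hello")); // prints HELLO
    }
}
```

The point is how little glue there is between the network and the stream layers; that seam is exactly where most other languages of the era made you write platform-specific code.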
The language itself has remained simple, with only 2 partially-incompatible changes over the years (inner classes including anonymous, and assert), and this may be its one saving grace against C# (which has a more complex syntax, but currently a much simpler library based on a cleaner syntax to most of MFC -- that will change in the future as M$ will always code-bloat their products).
The "interface" syntax is to me still Java's most powerful feature; again, not in that it provides any more capability than C++'s abstraction (as I worded it above), but by being so simple, it did more to improve inexperienced developers' OO code than any other OO syntax out there (IMHO). I'm not surprised at all that C# also chose to keep the interface keyword.
What is different in Jackson's version is that instead of the transition having already taken place in the past -- where the Aragorn they see at the Council of Elrond is all ready to take his place (his only personal fault being the breaking of the fellowship at Amon Hen, quickly forgotten when Gandalf returns) -- the transition from loner to leader is taking place before us.
Had Jackson not done that, there would be no character development in him or most of the non-hobbits at all.
Read the book again, specifically looking at the words from Elrond and Denethor on him, and in appendix A, and you'll see that transition: Denethor's Aragorn is not the one the hobbits met in Bree. Aragorn in the books has already matured to leadership, where the Aragorn in the movie is actively maturing before us.
I for one think Jackson's version works just fine, as the alternative while a good book character would be a rather flat part in a movie.
Got a "5"
Having seen TTT now, I have to follow up my post here a bit. Yes, as a whole, Jackson's take on the Aragorn story is generally true to Tolkien, just bumped up to taking place during the war for the ring instead of before it...but some of the exaggerations in TTT (falling off a cliff, and Arwen actually prepared to leave Middle-earth) I could have done without...
however, only having a $27 mil opening week for a $100+ mil film isn't exactly "a lot of money"...and there's no followup. LOTR:TTT is gonna trounce everything until Christmas, and well into next year. Paramount's only (though probably reliable) hope for that film to break even is home video and HBO licensing. Insurrection and ST-V both only became profitable (barely) through that market.
some chap wrote in reply:
LOTR is a red herring here. You can't say that one movie sucked simply because another movie is likely to do much better. They're completely different kinds of movies, with different styles, different actors and different production companies. There's no basis for comparison other than the fact that both are opening within a week of each other.
my word at the time:
I wasn't asserting that Nemesis sucked because of the upcoming TTT release, only that TTT is going to sell so well that everybody else is going to be off the map entirely, relatively speaking. Yes, Nemesis may still be #2 or #3, but the dropoff between week 1 and week 2 is going to be FAR worse than the usual 30-50%. The bad reviews and a bigger hit coming out, combined, will knock Nemesis for a loop this next week. It won't be profitable, much less "a lot of money", until the video/dvd, if at all.
My final word on the subject:
Lord of the Rings: $61.5 million over the weekend ($100 mil total)
Nemesis: last weekend $18 mil, this weekend $4.4 mil, a drop of over 70%.
So I think I was right here... I predict Paramount won't bother, as this one was supposed to have captured the "magic" of ST2:Khan, whatever that is that they seem unable to duplicate. If Paramount can't figure it out, and it loses them $100+ mil each time they fail, I'm guessing it's time to leave the movies alone...
But then again, I don't run a studio, do I?
Oh yeah, and I'm back @ 50 karma again