"Mammoth Snow Storm" Underwhelms

acroyear bias in the question? nah... (391 comments)

the second you used the phrase "government overreach", you gave away that you are, in fact, a right-wing (or libertarian-right) jerk who has no interest in anything other than discrediting the government so you don't have to pay taxes and exist as a member of this society.

to which I say, piss off.

2 days ago

Is 'SimCity' Homelessness a Bug Or a Feature?

acroyear easy solution (393 comments)

go back to simcity version 1 or 2. homelessness wasn't a problem 22 years ago, right?

about two weeks ago

Is Kitkat Killing Lollipop Uptake?

acroyear Re:What percentage can even get it? (437 comments)

on top of that, being only a month or so old, it wasn't ready when makers like Samsung needed to do their final packaging and testing for the Christmas-season phones; all of that had to happen in July through September to give the factories time to put the chip in and ship. How can I buy a 5.0 phone when the vast majority of phones on the market now left the factory 3 months before 5.0 was released?

plus a 5g or 6g phone running 5.0 is going to be quite a bit more expensive than the 2013 4g that, as a simple matter of *hardware*, is going to be fast enough for what most people throw at it, at least in the first few months. Why get a $250 (after contract) 5g phone with 5.0 on it when the 4g is only $99 and will run all the same apps just as well?

about three weeks ago

Is Kitkat Killing Lollipop Uptake?

acroyear i'm not buying a new device for the upgrade (437 comments)

if it is that much better, samsung would already be pushing the updates to my more recent devices.

one problem was the timing of the release. they put it out just a month ago, but everybody bought their Christmas toys *2* months ago, and that meant they had to be in the factory *4* months ago. It wasn't out nearly in time to make the 2014 sales, so it has no chance of an upswing until this summer or next Christmas.

about three weeks ago

What Isn't There an App For?

acroyear Sunday Morning was the wrong time to ask/post this (421 comments)

The only ones to reply so far are overly cynical, probably from having stayed up all night gaming or hacking code and likely are on their 6th cup of coffee.

about three weeks ago

Why Aren't We Using SSH For Everything?

acroyear because setting up an SSL cert is still a PitA (203 comments)

seriously, have you ever tried to get a cert installed properly in J2EE? Node? PHP/Apache? Ever tried to get PGP working right on t-bird?

There is nothing about the process that is straightforward in any way (including the cert signing stuff). Thus, most websites will simply find it easier to not bother. Let those who can pay for experts pay for it, but until expertise becomes "push this button" easy, and still almost free, it isn't worth it for typical web traffic.

about three weeks ago

Putting Time Out In Time Out: The Science of Discipline

acroyear where the hell... (323 comments)

...is Dave Brubeck in all of this?

Dear Google, I asked for "Time Out", not whatever this crap is about.

about a month ago

The Case For Flipping Your Monitor From Landscape to Portrait

acroyear How PARC first envisioned... (567 comments)

The article just talks about web reading, and how more and more webpages are responsive (for mobile reasons) in ways that now actually optimize sites for vertical orientation over horizontal.

However, from a coding and word-processing perspective, vertical layout is a bit better too, as it lets you see more of the text and its context than horizontal mode does. Thus, most developers who pay attention to such things do use both, as the first 5-point comment suggests above.

In fact, that was the actual vision of the PARC crew that invented GUI and WYSIWYG back in the 70s: the Xerox Alto workstation they created did have a vertical monitor, for this very reason: the idea was that if you were using a word processor to show you a page's layout, seeing the whole page on screen was the desired effect. They discovered it improved coding productivity once they were using the workstation to produce the software.

(that said, it is NOT better for spreadsheets or powerpoint, or database-editing tools, so there we are.)

about a month and a half ago

Birds Found Using Human Musical Scales For the First Time

acroyear Re:I'm sure I heard (80 comments)

So glad I'm not the only one who thought that when I read the headline.

about 3 months ago

Birds Found Using Human Musical Scales For the First Time

acroyear Re:The extent hearing is determined by physics... (80 comments)

That may be a fact, but we didn't need to know any of that. And neither do the birds.

That's the whole point of the harmonic series: our ears can hear the overtones in a note because they are there *physically* in the sound. It is unavoidable.

"Dissonant" notes, such as a tritone or a minor 2nd, aren't dissonant because of some sine wave detail: that's just a matter of mathematical transposition and simplification. The notes are dissonant because their collective overtones within them are clashing all the way up the harmonic series as well. We hear ALL of those harmonic clashes, even if we're not conscious of it. The sine wave isn't why they are dissonant: the harmonics are.
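For the curious, the arithmetic behind that argument, sketched here (these are standard harmonic-series facts, added by way of illustration, not part of the original comment):

```latex
% Overtones of a note with fundamental frequency f:
f_n = n f, \qquad n = 1, 2, 3, \dots
% A perfect fifth above f sits at \tfrac{3}{2} f, so overtones coincide early:
3f = 2 \cdot \tfrac{3}{2} f
% An equal-tempered tritone sits at \sqrt{2}\, f; since \sqrt{2} is irrational,
% m f = n \sqrt{2}\, f \text{ has no integer solutions } (m, n \in \mathbb{N}),
% so no overtone of one note ever lines up exactly with an overtone of the
% other -- the clashes "all the way up the series" the comment describes.
```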

about 3 months ago

Birds Found Using Human Musical Scales For the First Time

acroyear Re:The extent hearing is determined by physics... (80 comments)

Indeed. The nature of the relationship between the 'first' and 'fifth', the first harmonic overtone, is inherent in the actual physics of sound itself. The order of 'discovery' of the other notes of the scale inherently results from developing an ear to notice the other harmonics - it only takes finding 8 harmonics to end up with a pentatonic scale, found in almost every historical culture in the world.

It does, however, take conscious choice to actually develop the whole circle of fifths, the idea of modulation, and then the necessity of tempering the instrument - all aspects strictly of western tonality (with the resulting western atonality that followed in the 20th century: our atonality is actually still a limitation of our instruments of choice and the temperament they inherited).

I don't expect the birds to actually get that far...and even if they did, we'd just be accusing them of impersonating Messiaen's works.

about 3 months ago

Ask Slashdot: Where Do You Stand on Daylight Saving Time?

acroyear Where I stand... (613 comments)

...is in a place where, after 15 years of /., I am sick and tired of having this very same, and pointless (since nobody ever changes anybody's minds here), discussion twice a year, every year, like clockwork.

about 3 months ago

The Greatest Keyboard Ever Made

acroyear "greatest keyboard" bull (304 comments)

I was stuck with one on a DEC Alpha back in '94. Damn thing gave me carpal tunnel in a matter of days, and I refused to use the box until they replaced it with the contemporary version of the keyboards off a VT320.

about 4 months ago

End of an Era: After a 30 Year Run, IBM Drops Support For Lotus 1-2-3

acroyear Re: The decline started with OS/2 (156 comments)

It disappeared because Micro$oft was then well into their bundled-packages for OEMs, where they would offer huge discounts on Office (or even just Works) to OEMs to pre-install on Windows installations. If you already had Office, why go hunting for 123 anymore?

Yes, this is one of several things that the 90s-era antitrust lawsuits were intended to put a stop to, and it was a much bigger issue in the European cases than in the American DOJ case, which focused primarily on the browser market.

By the time the suits were resolved, however, the damage had already been done and Office (and IE) were the only players on the market. The browser situation recovered with Firefox and WebKit. The Office suite market never did, and only now, with people looking at cloud solutions like Google Drive, is there starting to be some pickup in Office alternatives.

about 4 months ago

Kickstarter's Problem: You Have To Make the Game Before You Ask For Money

acroyear Yeah, so? (215 comments)

Indeed. Kickstarter is not the initial funding to turn idea into prototype - that is always personal or corporate debt and always will be.

Kickstarter replaces the VCs as the means by which the prototype can become a marketable and distributable product.

about 5 months ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Object.observe() - a new built-in mechanism for getting change events when a plain javascript object changes (a property added, changed, or deleted). Chrome 36 is the first browser to get it, and it is on the standards track.

It greatly simplifies a lot of the work the model layers have to do, and means that many frameworks will, over time, get rid of their setter/getter mechanisms and all the weight they carry.
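As a hedged illustration only: the same change-record idea can be sketched with the Proxy API rather than Object.observe() itself. The `observe` helper and `Change` type below are made-up names for this sketch, not the real Object.observe() signature.

```typescript
// Minimal change-notification sketch using Proxy traps.
// `observe` and `Change` are illustrative, not the Object.observe() API.
type Change = { type: "add" | "update" | "delete"; name: string };

function observe<T extends object>(target: T, callback: (c: Change) => void): T {
  return new Proxy(target, {
    set(obj, prop, value) {
      const type: Change["type"] = prop in obj ? "update" : "add";
      (obj as any)[prop] = value;
      callback({ type, name: String(prop) });
      return true;
    },
    deleteProperty(obj, prop) {
      delete (obj as any)[prop];
      callback({ type: "delete", name: String(prop) });
      return true;
    },
  });
}

// A model layer can now subscribe without wrapping every field in setters.
const changes: Change[] = [];
const model = observe({ title: "draft" } as Record<string, unknown>,
                      (c) => changes.push(c));
model.title = "final";   // update
model.author = "alice";  // add
delete model.author;     // delete
```

This is the weight reduction the comment alludes to: the framework watches plain objects instead of forcing generated getters/setters onto every model property.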

about 5 months ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Yes, CSS tends to be my biggest issue. Some aspects of IE events are still wrong. How IE decided that the "unresponsive script" dialog should come up was horrid, constantly causing issues in sproutcore/ember apps.

Some of these things are better. They are getting closer to the standards. My take remains: it is their job to finish that, not my job to meet them halfway anymore. If my stuff works on IE, great, but I will never go out of my way to make it work anymore. I personally will never actually run my personal apps in IE, ever. I have 1 Windows 7 box left in the house (and 2 XP boxes are now Ubuntu); that box got Chrome on it as the very first thing.

(yes, professionally others in my company are doing that with our primary product. I professionally support that decision, though all I am doing as part of that effort is code-reviewing. These opinions are for my own personal projects independent of all of that.)

A lot of this comes down to what and where your market is. A big corporation targeting big corporations is free to spend that kind of money. But to do that, M$ needs to stop acting like they want to attract 'developers' as the original article implied. M$ does that by attracting execs and having such decisions forced on the developers from on high. Just as they've done in so many other things.

A small development shop targeting a more general user has no need to worry about IE compatibility as their primary concern. Wait for the money to actually be there (or at least be promised) before you invest in that type of thing.

about 5 months ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Microsoft's ability to control de facto standards is long gone. When Netscape had collapsed but Firefox was still too immature, they had control. The standards bodies worked to make html4, and Microsoft could happily ignore all of that, create their own broken implementation of CSS (this is worse than their jscript problems), and happily make everybody else play along.

Now what they want is that same control back, but as Disney once said, "That ship has sailed." This idea of pushing developers to create specialized webapps that are IE compatible betrays the fact that they have not been in control for years now.

I am not going to code to IE compatibility; I'm not going to break my application by trying to get it past IE's continually broken CSS. I code to the standards because the standards are the greatest likelihood that my application, if converted to a full-out mobile app via Cordova, or to a desktop app via Adobe AIR, will continue to function without change. My stuff mostly works in IE. It is up to IE to finish their job and fix their CSS and their event model and all of that stuff. I'll come partway by using certain libraries whose maintainers are willing to address compatibility issues (jQuery, for example), but no closer.

about 5 months ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear "not my job" - It is *Microsoft's* job (426 comments)

As much as Enterprise customers like to push the "it has to work on IE" crap (because they're usually working with lazy IT departments or legacy applications written by people with less interest in standards compliance than I have), in reality that shouldn't be my job when writing a web application. I code to the standards, or I use libraries and frameworks that code to the standards. These work in Firefox, Safari, and Chrome with minimal modification (assuming I'm not using a cutting-edge new feature like web audio, notifications, or O.o()) and impressive consistency.

They never work in IE without modification.

That's not my fault. That will never be my fault.

If you want to court developers, you go out there with IE, pick apps that have not gotten IE-fixing mods, and YOU (Micro$oft) fix the browser to the standards-compliant web applications already out there.

I'm sick of and done with working around your messes for the last 15 years.

about 5 months ago

Math, Programming, and Language Learning

acroyear Re: Your Results Will Vary (241 comments)

When a user interface is in, say, HTML5/DOM, one needs to know a lot about controlling how often the browser reloads the page or a portion of it, how much CSS is "too much" before the page rendering is slower than tolerable, how to control a loop to minimize server calls and DOM redraws, and much more. Similarly, a lot of UI work is search optimization: deciding how much of the search's power goes into the server/db and how much into the UI, then how to find and highlight the search results - all of these are complex processes that can be painfully slow or dramatically fast, but only if you know what bottlenecks to expect.
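As a sketch of the "minimize DOM redraws" point (hypothetical code, not from the comment): queue the mutations and apply them in one flush, so a hundred model changes cost one repaint instead of a hundred.

```typescript
// Coalesce many requested updates into a single "redraw".
// In a real app flush() would run from requestAnimationFrame; here it is
// called explicitly so the counting is easy to see.
class RenderBatcher {
  private queue: Array<() => void> = [];
  redraws = 0;

  schedule(update: () => void): void {
    this.queue.push(update); // cheap: just remember the mutation
  }

  flush(): void {
    if (this.queue.length === 0) return;
    for (const update of this.queue) update(); // apply everything at once
    this.queue = [];
    this.redraws++; // one repaint, no matter how many updates were queued
  }
}

const batcher = new RenderBatcher();
let width = 0;
for (let i = 1; i <= 100; i++) {
  batcher.schedule(() => { width = i; }); // 100 requested mutations
}
batcher.flush(); // width ends at 100, but only one "redraw" happened
```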

Anybody can draw a UI on a piece of paper and bullcrap the thing up in jquery-ui or xcode. Making a UI that is functional amidst constantly changing requirements involving ever-increasing quantities of data is much harder. Writing components, or maintaining frameworks and understanding the decisions the framework designers made (so that you don't violate them, often with painful results in stability or performance, when using them or adding to them), is a huge part of REAL UI work. I have yet to meet a toolkit I didn't need to fix or amend, in 20 years of UI work. I have yet, in any of those libraries/frameworks, to NOT have Big-O decisions to make when doing so in pieces involving components meant to handle large amounts of data, like grids and graphs.

No, it doesn't happen every day. But it happens at least every project.

If you don't speak basic optimization rules/practices like O(N) vs O(log(N)) (and yes, that HAS come up more than once in my work in the last 5 years, never mind the last 20 when I was doing more J2EE or MoM/Agents work), then I can't trust that you have any discipline in anything else. There's no point in even talking anymore if you can't speak what I consider to be a basic common CS vocabulary.
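The O(N)-vs-O(log(N)) vocabulary in question, as a quick illustrative sketch over sorted data:

```typescript
// Linear scan: O(N) comparisons on average.
function linearSearch(sorted: number[], target: number): number {
  for (let i = 0; i < sorted.length; i++) {
    if (sorted[i] === target) return i;
  }
  return -1;
}

// Binary search: O(log N) comparisons -- the difference that matters
// once a grid or graph component is holding large amounts of data.
function binarySearch(sorted: number[], target: number): number {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}

// A million even numbers: both find the same index, but binary search
// needs ~20 probes where the scan can need hundreds of thousands.
const data = Array.from({ length: 1_000_000 }, (_, i) => i * 2);
```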

Yes, much of this is trust: the piece of paper that is a degree is something of a document of trust that you have had certain disciplines and experiences, things that I can't actually get directly from your prior work, since I'm not actually allowed to ask to see your code in an interview or beforehand - that usually belongs to your former company.

We don't ask for higher maths because we expect you to use them; your 'experience' during your degree work is almost meaningless in-and-of itself. We ask for higher maths, and for the other aspects of a degree, because the degree is evidence of experience, motivation, and most importantly discipline.

And for many of the "I don't need maths blah blah blah a trade school is fine blah blah blah" posts on this thread (and this same thread every time this topic comes up, which is probably twice a year for the last 18 years I've been on /.), discipline is something I actually see sorely lacking in the comments.

about 6 months ago



acroyear acroyear writes  |  more than 7 years ago

acroyear writes "A court in Belgium found that Google's website caching policies are a violation of that nation's copyright laws, claiming that Google's cache offers effectively "free" access to articles that, while free initially, are later archived and charged for via subscriptions. Google claims that they only store "short extracts," but the court determined that's still a violation. The court found, "We confirm that the activities of Google News, the reproduction and publication of headlines as well as short extracts, and the use of Google's cache, the publicly available data storage of articles and documents, violate the law on authors' rights.""



still more wild on the future commentary ;-)

acroyear acroyear writes  |  about 12 years ago

Two parts to "intelligence" by human standards

There are really two things to consider in producing another creature like humans, one that visibly demonstrates its intelligence by manipulating its environment to suit it and asserting some mastery (domestication) over lesser animals.

One, of course, is actually having the brains and a complex means of communicating ideas besides "bad", "good", "i like that" and "i fear that". The other is the physical means of manipulating the environment, and tools. And we still don't clearly understand how this happened to us, so we can't properly predict how another spinoff creature might reach the same point. We see signs of growing tool use in other primates now, but much of that is a result of them watching us do it ("monkey see, monkey do"), and as such can't be relied on to make the type of predictions the show was aiming for.

Certainly ant and bee colonies are a clear demonstration of how an individual in a group can be an idiot and yet through signals sent by messenger individuals (utterly unaware of what they're carrying) the colony can show massive awareness and adaptation to situations in its environment. In other words, the colony shows signs of consciousness and individuality even as the individual insect shows none. The special decided to look at the extensions of such colonies in jellyfish and spiders.

I think another fault might have been ignoring the fact that our metal and buildings and trash would still be in some parts of the environment...it would have been nifty to see what might have evolved as a result of the situation of an abandoned city falling apart.


"It's not like all animals will grow really big if nothing killed them." -- not all, but some will, and if sticking your neck out gets you food others can't get, then long necks will likely evolve. The problem is that most creatures today eat grass (no need for a long neck unless the body is too tall, as with elk and moose) or climb up into the trees they eat from. For a turtle to evolve a long neck, it would have had to find some reason not to evolve grass-eating, which is nature's current direction, since grass is so prevalent (even giraffes can deal with grasses, though they prefer leaves).

"why would sharks suddenly have a need for bioluminescence" -- perhaps a lack of food on the surface of the oceans due to excessive heat or cold? Although some creatures (like cats) have been steadily gaining some degree of infrared heat-sensitivity to deal with the dark...

"remember how raptors hunt in packs and use logic" -- I still consider that speculation at this point...believable, but not enough to base an argument on...particularly not after having based a movie on it ;-)

"Also squid have existed for millions of years and have remained pretty much unchanged" -- I would agree with this assessment. Some creatures have definitely shown an end-of-the-road aspect to their evolutionary existence (as in, they're unchanged from pre-dinosaur times), and I don't see nature trying again with the invertebrates as long as there are vertebrates who can eat them.


on the show's prediction of the elimination of mammals:

yes, the big cats are disappearing, mostly due to us hunting them or taking away their own hunting and breeding grounds, but the small cats, particularly the domestic cats, are tremendously well adapted hunters of small insects and rodents, and given that most mass-extinctions tend to take out the larger animals, I can't see the food supply of cats ever disappearing...and small cats aren't hunted by anybody (you don't get much eating a meat-eater, and nature knows that).

Right now, we're the biggest killer of cats (roadkill) and the biggest danger to their extinction (eventually we might get stupid and "fix" the last domestic cat and poof! no more cats).


Hitting digital limitations? Go Analog!

acroyear acroyear writes  |  about 12 years ago In reply to this weblog entry on hitting DRM limitations, I wrote

There's one way they can't stop you, though there's a minor loss in quality (give or take... most heavy-compression audio formats are already lossy, so it really can't get any worse): run it through an analog mode on the way into a computer. Music won't really be "CD-quality", but for spoken word, who's going to notice? And this works fine for me when I'm making (personal use only) CDs from DVD and VHS concerts (that I own -- no piracy here, you RIAA jackasses).

Basically, what you do, requiring either 2 computers, or an intervening high-quality tape recorder or the Philips cd-recorder thingy, is wire one computer's "speaker" jack (not the headphone jack, as that's amplified and will distort) into the input (again, not the amplified "mic" jack) of another computer, play with the volume and input settings 'til you don't see red, then with appropriate software (n-track studio works for me on my win98 laptop @ home) record it into .wav files. With something in .wav mode, you can do anything you want. Remember to record in 16-bit, 44.1kHz stereo to match the CD rating.

If you don't have 2 computers handy, you can use a tape recorder with Dolby-B to make a temp copy that you play back into the computer, or a digital tape recorder will do even better, but again, using the analog i/o lines instead of trying an all-digital solution. Keep in mind that an audio cd holds 650-700MB (74min or 80min) of audio, so you'll need a lot of free disc space, on a single partition, to make the .wav files.
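The disc-space arithmetic behind that warning, as a sketch: uncompressed CD-quality PCM is 44,100 samples per second × 2 channels × 2 bytes per sample.

```typescript
// Uncompressed CD-quality PCM: 44.1 kHz, stereo, 16-bit samples.
const SAMPLE_RATE = 44_100;  // samples per second
const CHANNELS = 2;          // stereo
const BYTES_PER_SAMPLE = 2;  // 16-bit

function wavBytes(seconds: number): number {
  return seconds * SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE;
}

const bytesPerSecond = wavBytes(1);        // 176,400 bytes every second
const cd74 = wavBytes(74 * 60);            // 783,216,000 bytes for a 74-min disc
const cd74MiB = cd74 / (1024 * 1024);      // roughly 747 MiB of free space needed
```

So the "lot of free disc space" is concrete: a full 74-minute disc captured as .wav needs about three quarters of a gigabyte on one partition.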

Mind you, with any CD already cut, you can copy that independently, so the whole "you can only make one cd out of this" restriction is stupid and useless. But I read and noticed you already noted that flaw in their logic...

It really comes down to this: any one audio package by itself isn't enough to do all the audio you want to do, in order to stay DRM compliant, but the right combination of tools, and the willingness to pass through an analog mode on the way, will allow you to do anything.


More comments on nature/future

acroyear acroyear writes  |  about 12 years ago

Large Animals, Little Food, happened before...

The indricothere, of the rhino family, managed to survive for a cool few million years eating from rare trees in scrubland, dealing with wet and dry seasons much like the current african savannah (only without the grass).

However, there are enough grass-eaters out there that I don't think evolution's going to produce new tree-eaters, except ones that eat fruit as part of an omnivorous diet (like primates today).

Nature's produced grass-eaters (including ourselves, though we can only deal with the seeds, cooked -- wheat and barley are grasses) in quite a few mammal and lizard & salamander families, and I don't see the grass-eaters giving up and tree leaves becoming the core of the future diet again. Tree leaves are a luxury item, like giving carrots and/or radishes to a rabbit, but most mammals still need a steady regimen of grains.


Evolution requires mutation, and THAT's what's unpredictable

acroyear acroyear writes  |  about 12 years ago

A comment I just left in the "Future Is Wild" boards:

While Darwin's theory explains why things are the way they are with regard to survival, it doesn't explain the HOW, which is mutation. Mutations occur, and natural selection drives the duplication of the mutated genes 'til a new species is differentiated from the old.

However, the nature of how mutations really happen, and how "good" ones that are "preferred" arrive (as we're very keyed in to hating anything "different" ourselves, and often shun it in humans or kill it in animals), is what we as humans have not been able to truly see or test. It's hard to test, as mammals have too long a breeding period, and colonial insects (ants and bees) are usually dominated by the queen's genes. Most genes that change behaviours tended to have already been on the planet somewhere, and are only spreading now because we're accidentally spreading them (e.g., "africanized"/killer bees).

The show did a good job of suggesting what natural selection might do, given a set of mutations over X million years to produce said animals, but the fact is that the mutations themselves are what's utterly unpredictable...and truth be told, rather boring by comparison to the end-results we saw.

I consider evolution a fact, but not a law in the Newton/Einstein sense, because evolution can't be used to predict the future with any accuracy since evolution doesn't explain mutations; it only relies on them. It would be like trying to use Einstein to predict something in electrons without the use of calculus.


Inventing the Future is Wild

acroyear acroyear writes  |  about 12 years ago Not a bad show, though I personally disagree that the mammals will be extinct in 100 million years. Even a hit as substantial as the one that killed the dinosaurs wasn't enough to eliminate ALL reptiles or birds, and the little ones certainly got big within 10 million years (crocs and gastornis, e.g.), as did mammals coming up from basic rodent-like things. Certainly, I can accept that some subsets will die out like many already have, but I don't foresee all of them going. If insects and fish will continue to thrive and get tougher, mammals will get tougher just like they always have.

The series seems to imply that organized societal creatures (jellyfish, spiders, like ants today) will be the big survivors and that individually-thinking creatures won't make it, and this desire to limit the effect and adaptability of the individual intelligence I feel is misdirected.


Who Voted for JarJar?

acroyear acroyear writes  |  more than 12 years ago In the IMDB poll today (Dec 27, 2002), the question was "which mystery in a 2002 movie would you like to know the answer to?", with things like why does Dumbledore pick such lousy faculty (actually, in books 3 and 4, the dark arts teachers are great... but then again, the predictions teacher and the history guy who only talks about goblin rebellions is a bit much). The current winner as of 9am is "who voted for JarJar to be in the Senate".

Sorry, guys, but that's an easy one. As a result of the new alliance/union of the Gungans and the Naboo, the Naboo offered one of their seats to the Gungans. The Gungans voted for Jar Jar. Unanimously. To get him off the damn planet.



Past Slashdot comment on Java and C++

acroyear acroyear writes  |  more than 12 years ago In reply to someone's comment saying "Java's a gorgeous language [...], it's the language itself that won developers' hearts", I wrote:

Actually, the language itself isn't the greatest. It won over C++ developers for 3 reasons: a standard network library, closely tied to a (reasonably) well-designed, truly OO I/O library; a standard GUI api that worked across platforms; and considerable simplicity and code-readability over full-effect templates in C++ (read Alexandrescu's Modern C++ Design recently?).

A fourth capability, in the first released version of java but not used to its fullest capacity until Java2, was the use of "interface" over abstract base classes as a means of building frameworks in the newer library components such as JDBC, Collections, and XML. C++ always had this, but "interface" is a much simpler syntax and means of expression, verbally, than "abstract base classes with only pure virtual functions" (the C++ version of a Java interface). Also, newer C++ libraries and designs tend to rely on templates and traits to enforce an interface (ala generic programming) rather than class design, because it's easier to just write a class than to design a hierarchy -- java's "interface" took the hierarchy out of places it didn't need to be.
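The contrast being drawn can be sketched in TypeScript, whose interface keyword mirrors Java's; the class names here are illustrative, not from any real library:

```typescript
// The "interface" form: a pure contract, no hierarchy imposed.
interface Connection {
  send(data: string): void;
  close(): void;
}

// The C++-style alternative described above: an abstract base class with
// only "pure virtual" methods, which drags implementers into an
// inheritance hierarchy just to state the same contract.
abstract class AbstractConnection {
  abstract send(data: string): void;
  abstract close(): void;
}

// Implementing the interface is just a matter of matching its shape:
class LoggingConnection implements Connection {
  log: string[] = [];
  send(data: string): void { this.log.push(data); }
  close(): void { this.log.push("<closed>"); }
}

const conn = new LoggingConnection();
conn.send("hello");
conn.close();
```

Both forms express the same contract; the interface version simply keeps inheritance out of places it isn't needed, which is the point the comment makes.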

Now, one of those ended up an utter failure (AWT), and its replacement (Swing), though amazingly more successful as far as design and power go, is as noted dog-slow (though it's something that does get faster as machines get faster; Moore's law helps Swing considerably). I personally love Swing just because (when used properly) the WORA DOES work (layout management is the #1 problem for almost every bad interface out there, and that's not unique to Swing; I recall a lot of bad Motif layouts too, from Windows programmers not used to Xt's approach); also, the power of using renderers for complex components, dividing up the responsibility of showing the look vs managing the data, is something I miss in every other gui library out there.

The second, the standard networking library tied to the I/O library, remains its brilliant point and the basis for Java's most successful libraries and projects, including all its server-side work. Bjarne is so impressed by the standard socket + stream library that works on ALL platforms (it's the one that's most reliable in that respect of WORA) that he's planning to propose a standard socket interface for C++, though I think it's now too little, too late. I'm not saying that java.net and java.io aren't flawed. The use of abstract base classes for Socket and URLConnection, which likely dates to before the interface keyword was introduced, is a "bad thing"; java.util.Dictionary was like this as well, but at least that's been deprecated out. Similarly, java.nio addresses most of java.io's problems, but at considerable cost in code simplicity; if you need java.nio, it'll take a lot of time and work to use it correctly, and few books and articles are really making it clear when you actually need it. In the early stages of new technology, the "how do I use it" question well buries the "when do I need it" question.

The language itself has remained simple, with only 2 partially-incompatible changes over the years (inner classes including anonymous, and assert), and this may be its one saving grace against C# (which has a more complex syntax, but currently a much simpler library based on a cleaner syntax to most of MFC -- that will change in the future as M$ will always code-bloat their products).

The "interface" syntax is to me still Java's most powerful feature; again, not in that it provides any more capabilities than C++'s abstraction (as I worded it above), but by being so simple, it did more to improve inexperienced developers' OO code than any other OO syntax out there (IMHO). I'm not surprised at all that C# also chose to keep the interface keyword.


My Slashdot comment on Aragorn's story in Jackson's LOTR

acroyear acroyear writes  |  more than 12 years ago Well, I think many are misreading Jackson's take on the Aragorn story a bit. The development from a young man who wants nothing to do with responsibility and kings and crowns and Gondor, and just wants to hang out in the north with his ranger buddies and occasionally come into Rivendell and sweet-talk Arwen, into a mature, responsible leader ready to fight the worst of the worst and rule the entire free world (in kindness) IS in the book... it's just all done in 3rd-party recollections and in Appendix A; that is, it's already happened before Frodo meets him. It IS in Tolkien's story.

What is different in Jackson's is that instead of it having already taken place in the past, where the Aragorn they see at the Council of Elrond is all ready to take his place (with his only personal fault being the breaking of the fellowship at Amon Hen, quickly forgotten when Gandalf returns), the transition from loner to leader is taking place before us.

Had Jackson not done that, there would be no character development in him or most of the non-hobbits at all.

Read the book again, specifically looking at the words from Elrond and Denethor about him, and in Appendix A, and you'll see that transition: Denethor's Aragorn is not the one the hobbits met in Bree. Aragorn in the books has already matured to leadership, while the Aragorn in the movie is actively maturing before us.

I for one think Jackson's version works just fine, as the alternative, while a good book character, would be a rather flat part in a movie.

Got a "5" :)

Later on...

Having seen TTT now, I have to follow up my post here a bit. Yes, as a whole, Jackson's take on the Aragorn story is generally true to Tolkien, just bumped up to taking place during the war for the ring instead of before it...but some of the exaggerations in TTT (falling off a cliff, and Arwen actually prepared to leave Middle-earth) I could have done without...


Occasionally, I know movies...

acroyear acroyear writes  |  more than 12 years ago In another mailing list, talking on ST:Nemesis, some chap suggested that there will be another ST movie 'cause it "makes a lot of money" for Paramount. In reply, I wrote:

however, only having a $27 mil opening week for a $100+ mil film isn't exactly "a lot of money"...and there's no followup. LOTR:TTT is gonna trounce everything until Christmas, and well into next year. Paramount's only (though probably reliable) hope for that film to break even is home video and HBO licensing. Insurrection and ST-V both became profitable (barely) in that market.

some chap wrote in reply:

LOTR is a red herring here. You can't say that one movie sucked simply because another movie is likely to do much better. They're completely different kinds of movies, with different styles, different actors and different production companies. There's no basis for comparison other than the fact that both are opening within a week of each other.

my word at the time:

I wasn't asserting that Nemesis sucked because of the upcoming TTT release, only that TTT is going to sell so well that everybody else is going to be off the map entirely, relatively speaking. Yes, Nemesis may still be #2 or #3, but the dropoff between week 1 and week 2 is going to be FAR worse than the usual 30-50%. The bad reviews and a bigger hit coming out combined will knock Nemesis for a loop this next week. It won't be profitable, much less "a lot of money", until the video/DVD, if at all.

My final word on the subject:
Lord of the Rings: $61.5 million over the weekend ($100 mil total)
Nemesis: $18 mil last weekend, $4.4 mil this weekend, a drop of over 70%.

So I think I was right here... I predict Paramount won't bother, as this one was supposed to have captured the "magic" of ST2:Khan, whatever that is that they seem unable to duplicate. If Paramount can't figure it out, and it loses them $100+ mil each time they fail, I'm guessing it's time to leave the movies alone...

But then again, I don't run a studio, do I?


acroyear acroyear writes  |  more than 13 years ago Used to be the Groucho one, but now I've changed to one from a first-season Simpsons episode.

Oh yeah, and I'm back @ 50 karma again ;-)
