


Kickstarter's Problem: You Have To Make the Game Before You Ask For Money

acroyear Yeah, so? (211 comments)

Indeed. Kickstarter is not the initial funding to turn an idea into a prototype - that is always personal or corporate debt and always will be.

Kickstarter replaces the VCs as the means by which the prototype can become a marketable and distributable product.

about a week ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Object.observe() - a new built-in mechanism for getting change records when a plain JavaScript object changes (a property added, changed, or deleted). Chrome 36 is the first browser to ship it, though it is a proposed addition to the ECMAScript standard rather than part of HTML5.

Greatly simplifies a lot of the work the model layers have to do, and eventually means that many frameworks will be able to drop their setter/getter mechanisms and all the weight they carry.
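To make that weight concrete, here is a minimal, hypothetical sketch (invented names, not from any particular framework) of the defineProperty-based change tracking that model layers hand-roll, and that a built-in observation mechanism would replace:

```javascript
// Hypothetical sketch of the setter/getter machinery frameworks carry
// to detect model changes -- the kind of weight Object.observe() was
// meant to make unnecessary.
function makeObservable(obj, onChange) {
  const target = {};
  for (const key of Object.keys(obj)) {
    let value = obj[key];
    Object.defineProperty(target, key, {
      enumerable: true,
      get() { return value; },
      set(next) {
        const oldValue = value;
        value = next;
        onChange({ name: key, oldValue, newValue: next }); // one change record per write
      },
    });
  }
  return target;
}

// Every mutation fires a change record; no dirty-checking or polling.
const changes = [];
const model = makeObservable({ title: 'draft' }, (c) => changes.push(c));
model.title = 'final';
// changes now holds one record: { name: 'title', oldValue: 'draft', newValue: 'final' }
```

Object.observe() promised the same change records for plain objects without any of this per-property machinery.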

about a month ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Yes, CSS tends to be my biggest issue. Some aspects of IE events are still wrong. How IE decided that the "unresponsive script" dialog should come up was horrid, constantly causing issues in sproutcore/ember apps.

Some of these things are better. They are getting closer to the standards. My take remains: it is their job to finish that, not my job to meet them halfway anymore. If my stuff works on IE, great, but I will never again go out of my way to make it work. I personally will never run my personal apps in IE, ever. I have one Windows 7 box left in the house (and 2 XP boxes are now Ubuntu); the very first thing that box got was Chrome.

(yes, professionally others in my company are doing that with our primary product. I professionally support that decision, though all I am doing as part of that effort is code-reviewing. These opinions are for my own personal projects independent of all of that.)

A lot of this comes down to what and where your market is. A big corporation targeting big corporations is free to spend that kind of money. But to do that, M$ needs to stop acting like they want to attract 'developers', as the original article implied. M$ does that by attracting execs and having such decisions forced on the developers from on high. Just as they've done in so many other things.

A small development shop targeting a more general user has no need to worry about IE compatibility as their primary concern. Wait for the money to actually be there (or at least be promised) before you invest in that type of thing.

about a month ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear Re:"not my job" - It is *Microsoft's* job (426 comments)

Microsoft's ability to control de facto standards is long gone. When Netscape collapsed but Firefox was still too immature, they had control. The standards bodies worked to make HTML4, and Microsoft could happily ignore all of that, create their own broken implementation of CSS (worse even than their JScript problems), and make everybody else play along.

Now they want that same control back, but as Disney once said, "That ship has sailed." This idea of pushing developers to create specialized IE-compatible webapps betrays the fact that they have not been in control for years now.

I am not going to code to IE compatibility; I'm not going to break my application trying to get it past IE's continually broken CSS. I code to the standards because the standards give my application the greatest likelihood that, if converted to a full-out mobile app via Cordova, or to a desktop app via Adobe AIR, it will continue to function without change. My stuff mostly works in IE. It is up to IE to finish their job and fix their CSS and their event model and all of that stuff. I'll come part-way by using libraries whose maintainers are willing to address compatibility issues (jQuery, for example), but no closer.

about a month ago

Microsoft Considered Renaming Internet Explorer To Escape Its Reputation

acroyear "not my job" - It is *Microsoft's* job (426 comments)

As much as Enterprise customers like to push the "it has to work on IE" crap (because they're usually working with lazy IT departments or legacy applications written by people with less interest in standards compliance than me), in reality that shouldn't be my job for writing a web application. I code to the standards or I use libraries and frameworks that code to the standards. These work in Firefox, Safari, and Chrome with minimal modification (assuming I'm not using a cutting-edge new feature like web audio, notifications, or O.o()) and impressive consistency.

They never work in IE without modification.

That's not my fault. That will never be my fault.

If you want to court developers, you go out there with IE, pick apps that have not gotten IE-fixing mods, and YOU (Micro$oft) fix the browser to the standards-compliant web applications already out there.

I'm sick of and done with working around your messes for the last 15 years.

about a month ago

Math, Programming, and Language Learning

acroyear Re: Your Results Will Vary (241 comments)

When a user interface is in, say, HTML5/DOM, one needs to know a lot about controlling how often the browser reloads the page or a portion of it, how much CSS is "too much" before the page rendering is slower than tolerable, how to control a loop to minimize server calls and DOM redraws, and much more. Similarly, a lot of UI is search optimization, deciding how much power of the search goes into the server/db and how much in the UI, then how to find and highlight the search results - all of these are complex processes that can be painfully slow or dramatically fast, but only if you know what bottlenecks to expect.

Anybody can draw a UI on a piece of paper and bullcrap the thing up in jquery-ui or Xcode. Making a UI that stays functional amidst constantly changing requirements involving ever-increasing quantities of data is much harder. Writing components, or maintaining frameworks and understanding the decisions the framework designers made (so that you don't violate them when using them or adding to them, often with painful results in stability or performance), is a huge part of REAL UI work. I have yet to meet a toolkit I didn't need to fix or add to, in 20 years of UI work. I have yet, in any of those libraries/frameworks, NOT to have Big-O decisions to make when working on pieces involving components meant to handle large amounts of data, like grids and graphs.

No, it doesn't happen every day. But it happens at least every project.

If you don't speak basic optimization rules/practices like O(N) vs O(log(N)) (and yes, that HAS come up more than once in my work in the last 5 years, nevermind the last 20 when I was doing more J2EE or MoM/Agents work), then I can't trust that you have any discipline to anything else. There's no point in even talking anymore if you can't speak what I consider to be a basic common CS vocabulary.
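A sketch of the O(N) vs O(log(N)) distinction the paragraph refers to (an illustrative example in JavaScript, not the actual interview question):

```javascript
// Finding a value in a sorted array two ways: linear scan vs binary search.
function linearSearch(sorted, target) {
  for (let i = 0; i < sorted.length; i++) { // O(N): may touch every element
    if (sorted[i] === target) return i;
  }
  return -1;
}

function binarySearch(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {                        // O(log N): halves the range each pass
    const mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}

const data = Array.from({ length: 1 << 20 }, (_, i) => i * 2); // ~1M even numbers
console.log(linearSearch(data, 1999998)); // up to ~1M comparisons
console.log(binarySearch(data, 1999998)); // at most ~20 comparisons
```

In a grid or graph component fed a million rows, the difference between the two is the difference between painfully slow and dramatically fast.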

Yes, much of this is trust: the piece of paper that is a degree is something of a document of trust that you have had certain disciplines and experiences - things I can't actually get directly from your prior work, since I'm not allowed to ask to see your code in an interview or beforehand; that usually belongs to your former company.

We don't ask for higher maths because we expect you to use them; your 'experience' during your degree work is almost meaningless in-and-of itself. We ask for higher maths, and for the other aspects of a degree, because the degree is evidence of experience, motivation, and most importantly discipline.

And for many of the "I don't need maths blah blah blah a trade school is fine blah blah blah" posts on this thread (and this same thread every time this topic comes up, which is probably twice a year for the last 18 years I've been on /.), discipline is something I actually see sorely lacking in the comments.

about 2 months ago

Math, Programming, and Language Learning

acroyear Re: Your Results Will Vary (241 comments)

"while neglecting critical skills that could be immediately useful in a large ."

Lack of college seems to have neglected that critical skill of finishing a sentence.

Or was that too easy?

p.s. :)

about 2 months ago

Math, Programming, and Language Learning

acroyear Re: Your Results Will Vary (241 comments)

In the end, I've always concluded the same, every year this topic comes up (it is just about annual here). I don't want you to know calculus because you'll actually use calculus on the job (though I have an O(N) vs O(log(N)) question on my interviews that you'd better be able to answer).

I want you to know/pass calculus because by the time you've worked that hard at that level of proofs, you've mastered *variable control*. In each level of math, you don't get out of the class having mastered it. You get out of the class having mastered the year that came before it. When you can pass calculus, you have *mastered* functions and analytic geometry (I hire for UI work). Passing those two the year before showed you mastered variable manipulation and proofs in Algebra 2 before it.

So no, you don't *need* calculus on the job, but you need everything below it, and the best proof that you've *mastered* them (not just taken them, but mastered them) is that you've passed a calculus class.

about 2 months ago

Was Turing Test Legitimately Beaten, Or Just Cleverly Tricked?

acroyear given we're nerds who know this stuff... (309 comments)

...why didn't /. just wait for the skeptical posts calling the original news articles bullshit in the first place?

Seriously, weeding out the garbage posts, 3/4ths of the comments were calling bullshit when they saw it, and 1/4th were making pointless references to Skynet and HAL.

about 3 months ago

Turing Test Passed

acroyear Re:Garbage (432 comments)

If anything, perhaps it really shows how *un*intelligent our average casual conversation really is.

about 3 months ago

Turing Test Passed

acroyear Garbage (432 comments)

All it showed, like any other Turing Test, is the gullibility of the subjects.

1) "Ukrainian" speaking English
2) 13 years old

Right there you have set up an expectation in the audience of subjects for a limited vocabulary, no need for grammatical perfection, little need for slang, and a lack of education. Now add in "star wars and matrix" and you have reduced the topics of discussion even more to the ones the programmers know best.

This thing would never have answered a 'why' question, and it was under no pressure to be able to create a pun - both easy things for any older, educated human.

Garbage test, garbage results.

As usual.

about 3 months ago

Ask Slashdot: What Should Every Programmer Read?

acroyear Martin Fowler's Refactoring (352 comments)

No, in spite of what some jackasses say, it isn't just rewriting for its own sake. It is improving the structure in standardized ways so that you can add your new features much more safely. In interviews, I prefer that people can name some standard refactorings before I ask the typical questions about design patterns.
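As a concrete (hypothetical, invented names) illustration, here is one of the standard refactorings from Fowler's catalog, Extract Function - identical behavior, improved structure:

```javascript
// Before: the total calculation is tangled into the printing logic.
function printInvoiceBefore(order) {
  let total = 0;
  for (const item of order.items) total += item.price * item.qty;
  total *= 1 + order.taxRate;
  return `Invoice for ${order.customer}: $${total.toFixed(2)}`;
}

// After: Extract Function pulls the calculation out into its own named
// function, so a new feature (discounts, say) has one safe place to go.
function calculateTotal(order) {
  let total = 0;
  for (const item of order.items) total += item.price * item.qty;
  return total * (1 + order.taxRate);
}

function printInvoice(order) {
  return `Invoice for ${order.customer}: $${calculateTotal(order).toFixed(2)}`;
}

const order = { customer: 'Acme', taxRate: 0.05, items: [{ price: 10, qty: 3 }] };
console.log(printInvoice(order)); // "Invoice for Acme: $31.50"
```

The behavior is unchanged, which is the whole point: the refactoring is safe precisely because it is standardized and mechanical.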

about 4 months ago

How 'DevOps' Is Killing the Developer

acroyear Re:Oh please... (226 comments)

Please fuck off. No, really. Just fuck off.

I was typing in a fucking hurry. Its a fucking comments page of a fucking blog (one i have been fucking typing on since 19-fucking-98), not the fucking New York Times. if you want fucking capitalization to be fucking correct you can fucking type it up your fucking self because I don't give a fuck.

I presume I have expressed my attitude adequately enough? And capitalized New York Times correctly as well?

about 5 months ago

How 'DevOps' Is Killing the Developer

acroyear Re:It's just like when agile went hip... (226 comments)

Well I would counter that a DevOps team is needed even for customer-deployed or self-run cloud apps (where you're running the cloud in a data center instead of using services like NetApp or AWS). The jobs they do are useful there, in fact more so as staging DBs and environments for QA is harder when you're running all aspects of the cloud layer (though with customer-managed old-school desktop applications, QA is more responsible for their deployments, since testing installation instructions becomes part of their job).

But otherwise, yeah, treating DevOps like a panacea that will solve everything (thus necessitating hiring a DevOps manager who is 'smarter' than your lead architect, at possibly outrageous wages or options) is asking for failure.

about 5 months ago

How 'DevOps' Is Killing the Developer

acroyear Oh please... (226 comments)

You don't force your brightest people to take on additional roles - that is the whole point of a devops team in the first place. Making developers argue about deployments, send builds to QA, and manage your Git server, your development and QA databases, and your bug tracker is exactly what your developers should *not* be doing, especially if those scripts necessarily have to be in a different language than your application. Sure, your lead developers and architects work with the devops team to support them so they can in turn support you, but that's as far as the relationship goes.

The way we used to do it, where every senior architect is also responsible for all of those other functions (and has to take the time from his team members below him to help build all that out), is exactly how you stop architecting your software: your leads spend so much time trying to automate the drudgery they aren't improving the app.

They aren't improving the app because all of their brainpower is no longer focused on the *customers'* problems, but rather on their own and their team's. That isn't a good use of their time. Hiring smart people who understand the application and its environment, but are good at scripting these other languages of automation, frees up your team leads to do what they did before and do best: focus on the application and getting the team to produce the code that serves the customers' needs.

about 5 months ago

OKCupid Warns Off Mozilla Firefox Users Over Gay Rights

acroyear Re:They should use Firefox while *mocking* Eich (1482 comments)

So when the executive (or the DoJ) mounts a 'weak' defense, thus ensuring that the courts will rule against the law?

There are plenty of ways around absolute declarations like the one you're suggesting.

I'm not saying you're wrong per se, just that there are far too many angles and ways around any absolute position that the executive *must* do something.

In the case of California (or more recently, Virginia), when there is a major shift in the controlling party, it gets even more complicated. The referendum in VA was passed in 2006 under Republican leadership. The defense of the law became (in 2013) the responsibility of the new Democratic party Governor and the Democratic party Attorney General. What then is the responsibility of those officers to defend a law they never supported in the first place, particularly when they were elected on a platform encouraging equality?

The responsibility to defend (or not) a law may be the role of the executive, but at the same time the executive, as a representative of the people, must act in light of the fact that they were elected by a majority of the people. As such, a decision not to enforce - or more accurately, not to aggressively defend - a law they believe to be unconstitutional is a reflection of the faith in their office given to them by the majority that elected them.

If they do a poor job of it, they can just as easily be kicked out 4-6 years later in the next election.

That said, it would certainly have been easier on everybody if such blatantly Unconstitutional referendums weren't so easily passed by the states in the first place. A constitution, even a state-level one, shouldn't be so easily amended by a simple majority of whomever decides to vote one particular day. The U.S. Constitution is a pain to amend *because* it is meant to be immune to fly-by-night sentiments (prohibition being the one oddball exception to its history).

about 6 months ago

OKCupid Warns Off Mozilla Firefox Users Over Gay Rights

acroyear Re:They should use Firefox while *mocking* Eich (1482 comments)

but if it sincerely is unconstitutional, what worth is a Governor's (or President's) oath to "Defend the Constitution"?

Refuse it and you are breaking a responsibility of your office. Defend it and you are breaking your oath of office itself?

The legislature, and certainly not "the people" (in the case of a referendum), shouldn't have so much power to put an executive in such a bind.

about 6 months ago

OKCupid Warns Off Mozilla Firefox Users Over Gay Rights

acroyear They should use Firefox while *mocking* Eich (1482 comments)

It's all well and good that he spent $1,000 of his own money on legalizing bigotry.

Look at it from the other side of history: he *wasted* $1,000 for pretty much no reason at all, now that the courts have asserted that the ban was unconstitutional.

about 6 months ago

Million Jars of Peanut Butter Dumped In New Mexico Landfill

acroyear but expensive to defend (440 comments)

Costco would still be out millions of dollars defending itself, even if it won every case and every appeal. You don't get 100% of your legal costs paid back by the government when you win.

about 6 months ago

Ask Slashdot: What Do You Consider Elegant Code?

acroyear My personal "law" (373 comments)

Code written to do one thing is inherently elegant.

No code ever ends up having to do just one thing.

The job of requirements gathering is to determine which are the constants and which are the variables. In the case of, say, GPS, the constants should be the satellite protocol, the max and min number of satellites that can be found at any given time, the grid representation of the earth, and the system clocks.

Nice and easy, right?

Now change all of those to variables: you have satellites speaking to you in different protocols based on their age. You end up with only one satellite connection, so no triangulation, due to mountain or building blockage. The grid representation of the earth is inherently distorted at the north and south extremes (and whenever you're above 5,000 feet). Oh, and you forgot to time-distort your own clock for the rotation of the earth, so a tiny offset is being caused by General Relativity.

Suddenly code that was nice and simple is now full of ifs and switch statements and complex adjustments and bits of guesswork and comments that say "oh, well, we'll just have to ignore that last part... but we'll only be off by 30 feet or so".

The first bug in software happens when something that was presumed in the requirements to be a constant has to be changed into a variable. Every bug that follows is a result of trying to fix that first bug.

Because of that requirements problem, no working production code can ever be elegant.
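That first-bug progression can be sketched in code (hypothetical names; nothing like real GPS handling):

```javascript
// Version 1: the requirements said every satellite speaks protocol 'A'.
// One thing, done elegantly.
function parseReadingV1(raw) {
  return { lat: raw.lat, lon: raw.lon };
}

// Version 2: the "constant" protocol turned out to be a variable, and
// the single-purpose function sprouts branches and guesswork.
function parseReadingV2(raw) {
  switch (raw.protocol) {
    case 'A':
      return { lat: raw.lat, lon: raw.lon };
    case 'B': // older satellites report scaled integers (hypothetical)
      return { lat: raw.lat / 1e7, lon: raw.lon / 1e7 };
    default:
      // oh, well, we'll just have to ignore that last part...
      return { lat: 0, lon: 0, degraded: true };
  }
}

console.log(parseReadingV2({ protocol: 'B', lat: 387000000, lon: -770000000 }));
```

Every later bug fix adds another case or another flag to this switch, which is exactly how the elegance of version 1 never comes back.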

about 6 months ago



acroyear acroyear writes  |  more than 7 years ago

acroyear writes "A court in Belgium found that Google's website caching policies are a violation of that nation's copyright laws, holding that Google's cache offers effectively "free" access to articles that, while free initially, are later archived and charged for via subscriptions. Google claims that they only store "short extracts," but the court determined that's still a violation. The court found, "We confirm that the activities of Google News, the reproduction and publication of headlines as well as short extracts, and the use of Google's cache, the publicly available data storage of articles and documents, violate the law on authors' rights.""



still more wild on the future commentary ;-)

acroyear acroyear writes  |  more than 11 years ago

Two parts to "intelligence" by human standards

There are really two things to consider in producing another creature like humans, one that visibly demonstrates its intelligence by manipulating its environment to suit it and asserting some mastery (domestication) over lesser animals.

One, of course, is actually having the brains and a complex means of communicating ideas beyond "bad", "good", "I like that" and "I fear that". The other is the means of manipulating the environment: tools. And we still don't clearly understand how this happened to us, so we can't properly predict how another spinoff creature might reach the same point. We see signs of growing tool use in other primates now, but much of that is a result of them watching us do it ("monkey see, monkey do"), and as such can't be relied on to make the kind of predictions the show was aiming for.

Certainly ant and bee colonies are a clear demonstration of how an individual in a group can be an idiot and yet through signals sent by messenger individuals (utterly unaware of what they're carrying) the colony can show massive awareness and adaptation to situations in its environment. In other words, the colony shows signs of consciousness and individuality even as the individual insect shows none. The special decided to look at the extensions of such colonies in jellyfish and spiders.

I think another fault might have been ignoring the fact that our metal and buildings and trash would still be around in some parts of the world. It would have been nifty to see what might have evolved as a result of an abandoned city falling apart.


"It's not like all animals will grow really big if nothing killed them." -- not all, but some will, and if sticking your neck out gets you food others can't get, then long necks will likely evolve. The problem is that most creatures today eat grass (no need for a long neck unless the body is too tall, as with elk and moose) or climb up into the trees they eat from. For a turtle to evolve a long neck, it would have had to have found some reason not to evolve grass-eating, which is nature's current direction, since grass is so prevalent (even giraffes can deal with grasses, though they prefer leaves).

"why would sharks suddenly have a need for bioluminescence" -- perhaps a lack of food on the surface of the oceans due to excessive heat or cold? Although some creatures (like cats) have been steadily gaining some degree of infrared heat-sensitivity to deal with the dark...

"remember how raptors hunt in packs and use logic" -- I still consider that speculation at this point...believable, but not enough to base an argument on...particularly not after having based a movie on it ;-)

"Also squid have existed for millions of years and have remained pretty much unchanged" -- I would agree with this assessment. Some creatures have definitely shown an end-of-the-road aspect to their evolutionary existence (as in, they're unchanged from pre-dinosaur times), and I don't see nature trying again with the invertebrates as long as there are vertebrates who can eat them.


on the show's prediction of the elimination of mammals:

yes, the big cats are disappearing, mostly due to us hunting them or taking away their own hunting and breeding grounds, but the small cats, particularly the domestic cats, are tremendously well adapted hunters of small insects and rodents, and given that most mass-extinctions tend to take out the larger animals, I can't see the food supply of cats ever disappearing...and small cats aren't hunted by anybody (you don't get much eating a meat-eater, and nature knows that).

Right now, we're the biggest killer of cats (roadkill) and the biggest danger to their extinction (eventually we might get stupid and "fix" the last domestic cat and poof! no more cats).


Hitting digital limitations? Go Analog!

acroyear acroyear writes  |  more than 11 years ago In reply to this weblog entry on hitting DRM limitations, I wrote

There's one way they can't stop you, though there's a minor loss in quality (give or take... most heavy-compression audio formats are already lossy, so it really can't get much worse): run it through an analog mode on the way into a computer. The music won't really be "CD-quality", but for spoken word, who's going to notice? This works fine for me when I'm making (personal use only) CDs from DVD and VHS concerts (that I own -- no piracy here, you RIAA jackasses).

Basically, what you do, requiring either 2 computers or an intervening high-quality tape recorder (or the Philips CD-recorder thingy), is wire one computer's "speaker" jack (not the headphone jack, as that's amplified and will distort) into the line input (again, not the amplified "mic" jack) of another computer, play with the volume and input settings 'til you don't see red, then record it into .wav files with appropriate software (n-Track Studio works for me on my Win98 laptop at home). With something in .wav form, you can do anything you want. Remember to record in 16-bit, 44.1 kHz stereo to match the CD rating.

If you don't have 2 computers handy, you can use a tape recorder with Dolby B to make a temp copy that you play back into the computer; a digital tape recorder will do even better, but again, use the analog i/o lines instead of trying an all-digital solution. Keep in mind that an audio CD holds 650-700 MB plus (74 or 80 minutes), so you'll need a lot of free disc space, on a single partition, to make the .wav files.
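The disc space needed follows directly from those recording settings (16-bit samples, 44.1 kHz, 2 channels); a quick back-of-the-envelope check:

```javascript
// Uncompressed CD-quality PCM: 44,100 samples/sec * 2 bytes/sample * 2 channels.
const bytesPerSecond = 44100 * 2 * 2; // 176,400 bytes/sec

// Size in MB (1024*1024 bytes) of a .wav recording of the given length.
function wavSizeMB(minutes) {
  return (bytesPerSecond * minutes * 60) / (1024 * 1024);
}

console.log(wavSizeMB(74).toFixed(0)); // roughly 750 MB for a 74-minute disc
console.log(wavSizeMB(80).toFixed(0)); // roughly 810 MB for an 80-minute disc
```

So a full disc's worth of .wav files runs to the high 700s of MB, which is why the free space has to be on a single partition.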

Mind you, with any CD already cut, you can copy that independently, so the whole "you can only make one CD out of this" restriction is stupid and useless. But I read and noticed you already pointed out that flaw in their logic...

It really comes down to this: any one audio package by itself isn't enough to do all the audio you want to do, in order to stay DRM compliant, but the right combination of tools, and the willingness to pass through an analog mode on the way, will allow you to do anything.


More comments on nature/future

acroyear acroyear writes  |  more than 11 years ago

Large Animals, Little Food, happened before...

The indricothere, of the rhino family, managed to survive for a cool few million years eating from scarce trees in scrub-brush land, dealing with wet and dry seasons much like the current African savannah (only without the grass).

However, there are enough grass-eaters out there that I don't think evolution is going to produce new tree-eaters, except ones that eat fruit as part of an omnivorous diet (like primates today).

Nature's produced grass-eaters (including ourselves, though we can only deal with cooking the seeds -- wheat and barley are grasses) in quite a few mammal, lizard, and salamander families, and I don't see the grass-eaters giving up, with tree leaves becoming the core of the future diet again. Tree leaves are a luxury item, like giving carrots and/or radishes to a rabbit, but most mammals still need a steady regimen of grains.


Evolution requires mutation, and THAT's what's unpredictable

acroyear acroyear writes  |  more than 11 years ago

A comment I just left in the "Future Is Wild" boards:

While Darwin's theory explains why things are the way they are with regard to survival, it doesn't explain the HOW, which is mutation. Mutations occur, and natural selection drives the duplication of the mutated genes 'til a new species is differentiated from the old.

However, how mutations really happen, and how "good" ones that are "preferred" arise (given that we're very keyed in to hating anything "different" ourselves, and often shun it in humans or kill it in animals), is something we as humans have not been able to truly see or test. It's hard to test, as mammals have too long a breeding period, and colonial insects (ants and bees) are usually dominated by the queen's genes. Most genes that change behaviours tend to have already been on the planet somewhere, and are only spreading now because we're accidentally spreading them (e.g., "africanized"/killer bees).

The show did a good job of suggesting what natural selection might do, given a set of mutations over X million years to produce said animals, but the fact is that the mutations themselves are what's utterly unpredictable...and truth be told, rather boring by comparison to the end-results we saw.

I consider evolution a fact, but not a law in the Newton/Einstein sense, because evolution can't be used to predict the future with any accuracy: evolution doesn't explain mutations; it only relies on them. It would be like trying to use Einstein to predict the behaviour of electrons without the use of calculus.


Inventing the Future is Wild

acroyear acroyear writes  |  more than 11 years ago Not a bad show, though I personally disagree that the mammals will be extinct in 100 million years. Even a hit as substantial as the one that killed the dinosaurs wasn't enough to eliminate ALL reptiles or birds, and the little ones certainly got big within 10 million years (crocs and gastornis, e.g.), as did mammals coming up from basic rodent-like things. Certainly, I can accept that some subsets will die out, as many already have, but I don't foresee all of them going. If insects and fish will continue to thrive and get tougher, mammals will get tougher just like they always have.

The series seems to imply that organized societal creatures (jellyfish and spiders, like ants today) will be the big survivors and that individually-thinking creatures won't make it, and I feel this desire to limit the effect and adaptability of individual intelligence is misdirected.


Who Voted for JarJar?

acroyear acroyear writes  |  more than 11 years ago In IMDB's poll today (Dec 27, 2002), the question was "which mystery in a 2002 movie would you like to know the answer to?", with options like why Dumbledore picks such lousy faculty (actually, in books 3 and 4, the dark arts teachers are great... but then again, the predictions teacher and the history guy who only talks about goblin rebellions are a bit much). The current winner as of 9am is "who voted for JarJar to be in the Senate".

Sorry, guys, but that's an easy one. As a result of the new alliance/union of the Gungans and the Naboo, the Naboo offered one of their seats to the Gungans. The Gungans voted for Jar Jar. Unanimously. To get him off the damn planet.



Past Slashdot comment on Java and C++

acroyear acroyear writes  |  more than 11 years ago In reply to someone's comment saying "Java's a gorgeous language [...], it's the language itself that won developers' hearts", I wrote:

Actually, the language itself isn't the greatest. It won over C++ developers for 3 reasons: a standard network library, closely tied to a (reasonably) well-designed, truly OO I/O library; a standard GUI API that worked across platforms; and considerable simplicity and code-readability over full-effect templates in C++ (read Alexandrescu's Modern C++ Design recently?).

A fourth capability, present in the first released version of Java but not used to its fullest capacity until Java2, was the use of "interface" over abstract base classes as a means of building frameworks in the newer library components such as JDBC, Collections, and XML. C++ always had this, but "interface" is a much simpler syntax and means of expression than "abstract base class with only pure virtual functions" (the C++ version of a Java interface). Also, newer C++ libraries and designs tend to rely on templates and traits to enforce an interface (a la generic programming) rather than class design, because it's easier to just write a class than to design a hierarchy -- Java's "interface" took the hierarchy out of places it didn't need to be.

Now, one of those ended up an utter failure (AWT), and its replacement (Swing), though amazingly more successful as far as design and power go, is, as noted, dog-slow (though it's something that gets faster as machines get faster; Moore's law helps Swing considerably). I personally love Swing because, when used properly, the WORA DOES work (layout management is the #1 problem for almost every bad interface out there, and that's not unique to Swing; I recall a lot of bad Motif layouts too, from Windows programmers not used to Xt's approach). Also, the power of using renderers for complex components, dividing up the responsibility of showing the look vs. managing the data, is something I miss in every other GUI library out there.
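The renderer idea, one object owns the look while the component only manages the data, can be sketched against the real javax.swing API. The UpperCaseRenderer class below is a hypothetical example of mine; only JList and DefaultListCellRenderer are from the actual library:

```java
import java.awt.Component;
import javax.swing.DefaultListCellRenderer;
import javax.swing.JLabel;
import javax.swing.JList;

public class RendererDemo {
    // The renderer decides presentation; the list's model stays untouched.
    static class UpperCaseRenderer extends DefaultListCellRenderer {
        @Override
        public Component getListCellRendererComponent(JList<?> list, Object value,
                int index, boolean selected, boolean focused) {
            JLabel label = (JLabel) super.getListCellRendererComponent(
                    list, value, index, selected, focused);
            label.setText(String.valueOf(value).toUpperCase()); // look only
            return label;
        }
    }

    // Render one value the way a JList would, without showing a window.
    public static String renderedText(String value) {
        System.setProperty("java.awt.headless", "true");
        JList<String> list = new JList<>(new String[] { value });
        JLabel label = (JLabel) new UpperCaseRenderer()
                .getListCellRendererComponent(list, value, 0, false, false);
        return label.getText();
    }

    public static void main(String[] args) {
        System.out.println(renderedText("hello"));
    }
}
```

The data in the model is never modified; only the rendered label changes, which is the division of responsibility the paragraph above describes.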

The second, the standard networking library tied to the I/O library, remains its brilliant point and the basis for Java's most successful libraries and projects, including all its server-side work. Bjarne is so impressed by a standard socket + stream library that works on ALL platforms (it's the most reliable piece in that respect of WORA) that he's planning to propose a standard socket interface for C++, though I think it's now too little, too late. I'm not saying that java.net and java.io aren't flawed. The use of abstract base classes for Socket and URLConnection, which likely dates to before the interface keyword was introduced, is a "bad thing"; java.util.Dictionary was like this as well, but at least that's been deprecated out. Similarly, java.nio addresses most of java.io's problems, but at the considerable cost of code simplicity; if you need java.nio, it'll take a lot of time and work to use it correctly, and few books and articles are really making it clear when you actually need it. In the early stages of a new technology, the "how do I use it" question buries the "when do I need it" question.
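That socket + stream pairing fits in a few lines. A minimal loopback echo sketch, written in modern Java idiom (try-with-resources and lambdas, which postdate this post); the class and method names are mine:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class EchoDemo {
    // Send one line over a loopback socket and return what comes back.
    // Port 0 asks the OS for any free port, so this runs anywhere.
    public static String roundTrip(String msg) throws IOException {
        try (ServerSocket server = new ServerSocket(0)) {
            Thread echoServer = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(in.readLine()); // echo one line back
                } catch (IOException ignored) {
                }
            });
            echoServer.start();

            // The client side: same stream classes wrapped around a Socket.
            try (Socket client = new Socket("localhost", server.getLocalPort());
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()))) {
                out.println(msg);
                return in.readLine();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("hello"));
    }
}
```

The same Reader/Writer wrappers used on files work unchanged on a network connection, which is the "closely tied" design the paragraph above praises.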

The language itself has remained simple, with only two partially incompatible changes over the years (inner classes, including anonymous ones, and assert), and this may be its one saving grace against C# (which has a more complex syntax, but currently a much simpler library, a cleaner take on most of MFC -- that will change in the future, as M$ will always code-bloat their products).
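Both of those additions, anonymous inner classes (Java 1.1) and the assert statement (Java 1.4), fit in one small sketch; the Greeter interface and the class names are illustrative examples of mine:

```java
public class AnonDemo {
    interface Greeter {
        String greet(String name);
    }

    public static String greet(String name) {
        // Anonymous inner class: define and instantiate an implementation
        // in a single expression, with no named class anywhere.
        Greeter g = new Greeter() {
            public String greet(String n) {
                return "Hello, " + n;
            }
        };
        return g.greet(name);
    }

    public static void main(String[] args) {
        String msg = greet("world");
        // assert (1.4+): checked only when the JVM runs with -ea.
        assert msg.equals("Hello, world") : "greeting mismatch";
        System.out.println(msg);
    }
}
```

Both were "partially incompatible": anonymous classes changed the compiler's class-file naming scheme, and assert turned a previously legal identifier into a keyword.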

The "interface" syntax is to me still Java's most powerful feature; again, not because it provides any more capability than C++'s abstraction (as I worded it above), but by being so simple, it did more to improve inexperienced developers' OO code than any other OO syntax out there (IMHO). I'm not surprised at all that C# also chose to keep the interface keyword.


My Slashdot comment on Aragorn's story in Jackson's LOTR

acroyear acroyear writes  |  more than 11 years ago Well, I think many are misreading Jackson's take on the Aragorn story a bit. The development from a young man who wants nothing to do with responsibility and kings and crowns and Gondor, and just wants to hang out in the north with his ranger buddies and occasionally come into Rivendell and sweet-talk Arwen, into a mature, responsible leader ready to fight the worst of the worst and rule the entire free world (in kindness) IS in the book... it's just all done in third-party recollections and in Appendix A; that is, it's already happened before Frodo meets him. It IS in Tolkien's story.

What is different in Jackson's version is that instead of it having already taken place in the past, where the Aragorn we see at the Council of Elrond is all ready to take his place (with his only personal fault being the breaking of the fellowship at Amon Hen, quickly forgotten when Gandalf returns), the transition from loner to leader takes place before us.

Had Jackson not done that, there would be no character development in him or most of the non-hobbits at all.

Read the book again, specifically looking at the words from Elrond and Denethor on him, and in Appendix A, and you'll see that transition: Denethor's Aragorn is not the one the hobbits met in Bree. Aragorn in the books has already matured into leadership, whereas the Aragorn in the movie is actively maturing before us.

I for one think Jackson's version works just fine, as the alternative, while a good book character, would be a rather flat part in a movie.

Got a "5" :)

Later on...

Having seen TTT now, I have to follow up my post here a bit. Yes, as a whole, Jackson's take on the Aragorn story is generally true to Tolkien, just bumped up to taking place during the war for the ring instead of before it... but some of the exaggerations in TTT (falling off a cliff, and Arwen actually prepared to leave Middle-earth) I could have done without...


Occasionally, I know movies...

acroyear acroyear writes  |  more than 11 years ago In another mailing list, talking on ST:Nemesis, some chap suggested that there will be another ST movie 'cause it "makes a lot of money" for Paramount. In reply, I wrote:

however, only having a $27 mil opening week for a $100+ mil film isn't exactly "a lot of money"... and there's no followup. LOTR:TTT is gonna trounce everything until Christmas, and well into next year. Paramount's only (though probably reliable) hope for that film to break even is home video and HBO licensing. Insurrection and ST-V both became profitable (barely) thanks to that market.

some chap wrote in reply:

LOTR is a red herring here. You can't say that one movie sucked simply because another movie is likely to do much better. They're completely different kinds of movies, with different styles, different actors and different production companies. There's no basis for comparison other than the fact that both are opening within a week of each other.

my word at the time:

I wasn't asserting that Nemesis sucked because of the upcoming TTT release, only that TTT is going to sell so well that everybody else is going to be off the map entirely, relatively speaking. Yes, Nemesis may still be #2 or #3, but the dropoff between week 1 and week 2 is going to be FAR worse than the usual 30-50%. The bad reviews and a bigger hit coming out, combined, will knock Nemesis for a loop this next week. It won't be profitable, much less "a lot of money", until the video/DVD release, if at all.

My final word on the subject:

Lord of the Rings: $61.5 million over the weekend ($100 million total).
Nemesis: $18 million last weekend, $4.4 million this weekend, a drop of over 70%.

So I think I was right here... I predict Paramount won't bother, as this one was supposed to have captured the "magic" of ST2:Khan, whatever that is that they seem unable to duplicate. If Paramount can't figure it out, and it loses them $100+ mil each time they fail, I'm guessing it's time to leave the movies alone...

But then again, I don't run a studio, do I?


acroyear acroyear writes  |  more than 12 years ago Used to be the Groucho one, but now I've changed to one from a first-season Simpsons episode.

Oh yeah, and I'm back @ 50 karma again ;-)
