
The End of Native Code?

Cliff posted more than 8 years ago | from the maybe-not-JITs-yet dept.

1173

psycln asks: "An average PC nowadays holds enough power to run complex software programmed in an interpreted language which is handled by runtime virtual machines, or just-in-time compiled. Particular to Windows programmers, the announcement of MS-Windows Vista's system requirements means that future Windows boxes will laugh at the memory/processor requirements of current interpreted/JIT compiled languages (e.g. .NET, Java, Python, and others). Regardless of the negligible performance hit compared to native code, major software houses, as well as a lot of open-source developers, prefer native code for major projects even though interpreted languages are easier to port cross-platform, often have a shorter development time, and are just as powerful as languages that generate native code. What does the Slashdot community think of the current state of interpreted/JIT compiled languages? Is it time to jump in the boat of interpreted/JIT compiled languages? Do programmers feel that they are losing - an arguably needed low-level - control when they use interpreted languages? What would we be losing besides more gray hair?"


What else (3, Funny)

Anonymous Coward | more than 8 years ago | (#15520725)

We might be loosing our ability to spell the verb "lose".

No, wait, too late.

On the subject of loosers... (3, Funny)

Kelson (129150) | more than 8 years ago | (#15520755)

No, no, obviously, they're loosing grey hair in the same sense that one "looses the dogs" -- i.e. they're setting the grey hair free.

Re:On the subject of loosers... (3, Funny)

Anonymous Coward | more than 8 years ago | (#15520816)

Thank you!

I was beginning to think I had gone mad, or perhaps there was a committee that changed the spelling of "lose" without telling me. I honestly haven't seen anyone spell it correctly in months. It's starting annoy me as much as people can't tell they're from there from their.

Re:On the subject of loosers... (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15521042)

> It's starting [to] annoy me as much as people [who] can't tell they're from there from their.

I'm not from there. I'm sorry if it annoys you, but, really, I'm not.

(You should probably put some quotes in there somewhere).

It's usually USians who get these things wrong, of course. Those I've talked to really don't seem to care much and explain it away as "evolution of the language".

I guess there's an argument there somewhere, but I look to history and find 'reports' of 'people' (iirc, Webster) deliberately changing the language - not out of ignorance, but just to piss off the English. It worked - and is still working.

Just another example of 'Embrace and extend' - part of the US culture (is that an oxymoron?). The USA is the 'Microsoft' of the English language - copy it, then change it so that it doesn't work properly with the original version. ... and you lot complain about China.

If I had the opportunity to change the language, I would make more obvious changes, like remove silent letters.

Wow - you actually used 'correctly' correctly! Awesome! Obviously, you're not an Apple user, else you'd spell it different.

BTW, for anyone who's interested, it's not "British English", it's just "English". The Scots, Welsh and Irish each have their own language - some even use them.

Re:On the subject of loosers... (2, Interesting)

Anonymous Coward | more than 8 years ago | (#15521130)

If I had the opportunity to change the language, I would make more obvious changes, like remove silent letters.

Well, what do you think we've been doing over here? Just that.

We've taken out useless letters, such as colour -> color and catalogue -> catalog. We've simplified superfluous pronunciations, as in aluminium -> aluminum. And we've made the number system more concise and practical, for example thousand million -> billion.

There's much more work to be done; for example, every word that contains an "ough" needs to be reworked. But at least one country has taken up the initiative and made the first few steps towards a more rational language.

Re:What else (-1, Troll)

Anonymous Coward | more than 8 years ago | (#15520910)

I want to sniff some ASS-PANTIES!!1!!!!!!!1~11

What?!?!? (0, Troll)

Jozer99 (693146) | more than 8 years ago | (#15520734)

Has the author of this article ever USED any non-native code powered applications? Stuff written in Java and .Net (especially Java) runs like crap. Even with a fancy new dual-core processor and gigs of RAM, running a simple non-compiled word processor will bring your system to a standstill.

Re:What?!?!? (4, Funny)

Kethinov (636034) | more than 8 years ago | (#15520750)

I know what you mean. In Linux, I used tons of music apps like Banshee and Amarok for their features, but got fed up and went back to XMMS for its speed. JIT languages are NOT appropriate for every task.

Re:What?!?!? (0)

Jozer99 (693146) | more than 8 years ago | (#15520772)

They aren't very good for ANY task as far as I can see. Playing an MP3 in native compiled code requires a 15 MHz processor if you don't optimize too much, or about 1% - 2% of a modern processor. Playing an MP3 in non-compiled code will eat up all the cycles on a modern P4 or Athlon 64 processor.

Re:What?!?!? (4, Funny)

bunions (970377) | more than 8 years ago | (#15520791)

They aren't very good for ANY task as far as I can see.

fun fact: slashdot is written in an interpreted language (perl).

wait a minute, the kid might be onto something ...

Re:What?!?!? (1)

Ambush Commander (871525) | more than 8 years ago | (#15520867)

Well, server-side applications and client-side applications are, quite arguably, different sorts of programs.

Re:What?!?!? (1)

bunions (970377) | more than 8 years ago | (#15520931)

They aren't very good for ANY task as far as I can see.

I don't see the part where he says "they aren't very good for ANY client-side tasks". And more and more traditionally client-side applications are moving server-side, what with this new-fangled interweb and your ajaxes and such and such.

Re:What?!?!? (1)

Ambush Commander (871525) | more than 8 years ago | (#15520973)

I'll give the parent post the benefit of the doubt. The article text is worded in a way that strongly suggests client-side.

Re:What?!?!? (0)

Anonymous Coward | more than 8 years ago | (#15520805)

Fun fact: Even in non-native apps, the playing itself is usually done by native libraries. (GStreamer, Xine, libmad, etc.)

The non-native code just drives the UI.

Re:What?!?!? (1)

Kethinov (636034) | more than 8 years ago | (#15520831)

Fun fact: Even in non-native apps, the playing itself is usually done by native libraries. (GStreamer, Xine, libmad, etc.)

The non-native code just drives the UI.
Yet these apps still incur a severe performance hit. Clicking the "next song" button in Quod Libet (a Python app), for example, takes up to one or two seconds to load the next song for me. XMMS is instant.

Re:What?!?!? (1)

Kethinov (636034) | more than 8 years ago | (#15520813)

They're appropriate for many tasks. Web development is a good example, along with productivity apps that don't depend on speed. An alarm clock app, or post it notes, or other such small things. But apps making use of binary media or requiring extensive math such as a physics engine on a game or an emulator should use native code.

Re:What?!?!? (2, Informative)

Midnight Thunder (17205) | more than 8 years ago | (#15520869)

For heavy number crunching, interpreted languages are not there yet. Then again, compiled code is not great for everything either; sometimes you have to write hardware-specific assembly code. On the other hand, if you are doing something that spends more time waiting on the user than actually working, then an interpreted language is great.

Like everything, you need to make a choice about when to use one approach or another. Remember, you want to get a program out the door and make it available to as wide an audience as possible, so you will write in the language that is best suited for the task. If you know where to delegate the heavy number crunching, then you can spend the rest of the time making a great program.

Look at some of the programs written in Java using OpenGL. They are fast, but that is because they hand off the necessary heavy lifting to low-level APIs; heck, even compiled languages like C/C++ do the same. See here for example: https://jogl.dev.java.net/ [java.net]
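The delegation the parent describes can be seen even inside a single interpreted program; in Python, for example, the built-in sum() runs its loop in C, so it usually beats an equivalent interpreter-level loop. A rough sketch (timings will vary by machine):

```python
import time

def py_sum(values):
    """Pure-Python loop: every iteration runs in the interpreter."""
    total = 0
    for v in values:
        total += v
    return total

values = list(range(1_000_000))

t0 = time.perf_counter()
slow = py_sum(values)
t_py = time.perf_counter() - t0

t0 = time.perf_counter()
fast = sum(values)  # built-in sum: the loop itself runs in C
t_c = time.perf_counter() - t0

assert slow == fast
print(f"interpreted loop: {t_py:.4f}s, C-backed built-in: {t_c:.4f}s")
```

The same results come out of both loops; the only difference is which side of the native/interpreted boundary the iteration happens on.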

Re:What?!?!? (0)

Anonymous Coward | more than 8 years ago | (#15521029)

They aren't very good for ANY task as far as I can see.

Doesn't sound like you can see too far, and you're not helping anyone else see any further either.

Why not contribute some actual information? Take whatever C MP3 library you used in your "tests" and get it compiling with Portable.NET's C compiler [southern-storm.com.au]. Then benchmark it against the natively-compiled version and post the results.

If you wanted to get wild, you could even run a profiler on the results and let the pnet guys know about any particularly slow bits so they can address them.

Amarok? (1)

Ender Ryan (79406) | more than 8 years ago | (#15520930)

~ $ file /usr/bin/amarok
/usr/bin/amarok: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.6.6, dynamically linked (uses shared libs), stripped

~ $ file /usr/bin/amarokapp
/usr/bin/amarokapp: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.6.6, dynamically linked (uses shared libs), stripped

amarok-1.4.0/amarok $ ls src/
*snip* a bunch of C++ source and header files.

Come again?

Re:Amarok? (1)

Kethinov (636034) | more than 8 years ago | (#15521061)

Yeah, I realized that thing about Amarok after I posted my comment, but the criticism is still valid. It may not be written in an interpreted language, but it's still ridiculously slow. And it crashes a lot.

Re:What?!?!? (0)

Anonymous Coward | more than 8 years ago | (#15520934)

You do realize Amarok (and maybe Banshee, never used it or read much about it) is written in C++ (it's a KDE/Qt app). Though in my case I prefer Amarok over XMMS (XMMS sucks for managing a large amount of music).

Re:What?!?!? (1)

Kethinov (636034) | more than 8 years ago | (#15521086)

Amarok is native, but still hideously slow (and crashes a lot). Banshee is a C#/Mono app. Quod Libet, another good example, is a Python app. Point is, a lot of modern media apps are written with features but not performance in mind, and many of them are done in interpreted languages. The kings of all music apps, iTunes and Winamp (or XMMS for Linux), do not suffer from said performance problems.

Re:What?!?!? (2, Interesting)

tomhudson (43916) | more than 8 years ago | (#15520780)

It's all bs.

15 years ago I benchmarked assembler vs. C for graphics code - C was 200x slower. There is NO way that any interpreted runtime will even begin to approach the "bare metal", never mind C.

Most of the benchmarks crowing about the speed of JIT compilers ignore the startup and initialization time, as well as the end-run time.

I couldn't believe some of the naive assumptions in one published benchmark - they had the Java code print out its start and end time and said "see, only 4x slower than C"; naive is being polite. Proper benchmarking would mean putting a wrapper around both code examples, to handle the start and end time notification.
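A wrapper of the kind the parent suggests just times the whole process from the outside, so VM startup and shutdown are counted rather than self-reported. A minimal sketch in Python (the command being benchmarked is whatever you pass in):

```python
import subprocess
import sys
import time

def time_whole_process(cmd):
    """Run a command and time it from the outside, so interpreter/VM
    startup and shutdown are included in the measurement."""
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - t0

# Example: measure this Python interpreter's own startup overhead
# by running a program that does nothing.
elapsed = time_whole_process([sys.executable, "-c", "pass"])
print(f"python startup + exit: {elapsed:.3f}s")
```

Running the same wrapper over a native binary and a JIT-hosted equivalent gives numbers that actually include the cost the self-timed benchmark hides.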

And yet? (1)

LordoftheLemmings (773163) | more than 8 years ago | (#15520834)

And yet, even though C was 200x slower, most people now program in C or a "higher" language. Why? Portability and ease of use. Isn't that what interpreted languages offer us more of?

Re:And yet? (1)

tomhudson (43916) | more than 8 years ago | (#15520974)

... and lots of embedded assembler.

Use C as a wrapper, embed your custom assembler for each platform inside each function, and you've got portability AND performance.

that's how it's done. And that's not going to change. Besides, at the risk of bringing this back on-topic (they mentioned the requirements of Vista), how "portable" is Windows nowadays? Oh, right, it's the least portable mainstream OS there is.

"Managed code" sucks. And the runtimes for those "managed code" languages are written in ... wait for it ... native code. Can't run that "managed code" without a runtime.

Re:And yet? (1)

martinultima (832468) | more than 8 years ago | (#15521144)

The only problem with that is, if you wanted to port to a new system, you'd either need to (a) write a new assembly-language routine for the new platform's CPU architecture, or (b) have a platform-independent C/C++ fallback which you could use on non-native platforms, kind of like a lot of libraries such as imlib2 and SDL do for processor-intensive code – sorry if I'm butchering the terminology ;-) While assembly language makes code run fast, C/C++ makes writing code fast, at least if you know C/C++. It's really the same thing as with X and fancy video drivers: while there are nice direct-rendering drivers for the more popular video cards, there still has to be a basic one that goes through various X calls for the cards that don't have direct rendering. Kind of a loose analogy, but I think it makes sense, anyway.

Disclaimer: I know enough C to write a basic "hello world" program and that's about it... I'm a Linux distribution maintainer, so I'm a lot better at building software than writing it.
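The fallback pattern described above shows up in high-level languages too; Python's own standard library ships accelerated C modules with pure-Python fallbacks. A minimal sketch of the pattern - the `_fastsum` extension module here is hypothetical:

```python
try:
    # Hypothetical hand-optimized native extension module.
    from _fastsum import checksum
except ImportError:
    # Portable pure-Python fallback, used on platforms without the
    # native build -- slower, but always available.
    def checksum(data: bytes) -> int:
        total = 0
        for byte in data:
            total = (total + byte) & 0xFFFF
        return total

print(checksum(b"hello"))
```

Callers import one name and get the fastest implementation the platform can provide, which is exactly the portability-plus-performance trade the thread is arguing about.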

Re:What?!?!? (1)

NemosomeN (670035) | more than 8 years ago | (#15520909)

Maybe you sucked at writing C. I can't imagine that huge of a difference in performance unless you really just did something wrong. That could include choosing a bad thing to benchmark (Maybe something that triggered a compiler bug?).

Re:What?!?!? (1)

tomhudson (43916) | more than 8 years ago | (#15521005)

... are you really going to claim that ANY C spat out by a compiler can be faster than hand-optimized assembler? Especially when it comes to graphics code.

Think. As one example, the compiler makes assumptions about registers that I'm free to ignore. Ditto with pushes and pops, etc. There are shortcuts you can take in assembler that you can't in C. It doesn't get any faster than assembler. And assembler is probably the most beautiful language to work in, because it's what YOU make of it.

Re:What?!?!? (1)

chris_eineke (634570) | more than 8 years ago | (#15521118)

Eh, sorry. Assembler is as beautiful as a truck diesel engine. Raw horsepower, but rather useless in a Ferrari. If you want to see a beautiful programming language, how about one that allows one to express code as data?

Re:What?!?!? (0)

Anonymous Coward | more than 8 years ago | (#15520922)

The Java VM startup time wouldn't scale with the total time the program runs for: it would be (approximately) a constant additive cost, and thus (asymptotically) uninteresting. Thus producing the time counts from within the language is valid. (Provided they made the same allowance for C.)

Besides which, Java is doomed to take far longer than C anyway: checking array bounds, checking validity of downcasts... etc. Oh, and the overhead of a decent GC :)

Oh, and optimizing compilers have advanced a bit in the last 15 years. :)

Re:What?!?!? (1)

tomhudson (43916) | more than 8 years ago | (#15521033)

The problem with the VM startup time is that a lot of code nowadays is "glue code". A small app starts up, shells out to another one to do a task, takes the returned data, shells out to another one .... all those new VMs get expensive.

The workaround is worse - 1 VM that starts up and never quits, just hosting each new process. In other words, 1 bad process can now bring all the processes down, so you have to add even MORE code to check that's not happening. Synchronization is already hugely expensive in Java ... this will add a whole new layer of "things to check between code points". The speed gains won't materialize, as the checking that has to be done increases faster than N+1.

Re:What?!?!? (2, Insightful)

Anonymous Coward | more than 8 years ago | (#15521105)

You seem very self-impressed. Fifteen years ago, before you were senile, I'm sure your benchmarks led to reasonable conclusions. Your C skills must have never been much to flaunt if they were compiling to a result 200 times slower than the "bare metal". I'm hoping for your sake that was exaggeration, but somehow I doubt it. The harsh reality is that years of work by brilliant minds have pushed forward to develop compilers capable of putting out code that would put your best work to shame. Optimizations you likely haven't thought of are regular passes in modern compilation.

JIT techniques, much like compiler optimizations, are developing day by day. Efficient caching techniques and dynamic recompilation and optimization are paving the way to an era of practical programming where an interpreted programming language can present comparable speeds to match against its compiled predecessors.

Like an old man harping about the "good ol' days", you are simply blinded by your years of experience and afraid of change. You cry "naivete" with respect to those who even suggest that there may be performance merits to interpreted languages, while showing few real-world modern figures or statistics. Naive! Naive! Naive!

Friend, blindly disregarding established JIT theory or advances in interpreted programming language performance sounds fairly naive to me.

Have you tried coding anything hard? (4, Insightful)

Anonymous Coward | more than 8 years ago | (#15520741)

When your web-based datastore gets 50,000 inserts per second, hovers between 15 and 20 billion rows and endures a sustained query rate of 43,000 queries per hour, tell me which part of it you want coded in PHP.

Re:Have you tried coding anything hard? (2, Insightful)

Hamled (742266) | more than 8 years ago | (#15520849)

The blinking lights, obviously!

But seriously, I like the Python model of "code the performance intensive parts in lower-level languages, code the rest in higher level languages that control it." If using an interpreted language would afford you more flexibility, more powerful language constructs, and faster development times, without much negative impact on parts of your system that are not performance-critical, it's almost a no-brainer to use them for those areas.
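Gluing a high-level language to lower-level code doesn't always require writing an extension module; Python's ctypes can call into an existing C library directly. A small sketch, assuming a Unix-like system where the C math library can be located:

```python
import ctypes
import ctypes.util

# Locate and load the platform's C math library; the resolved name
# varies by OS (e.g. libm.so.6 on Linux).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes marshals doubles correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # computed by native code, driven from Python
```

The performance-critical work runs in compiled code while the control flow stays in the interpreted language, which is the split being advocated above.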

Re:Have you tried coding anything hard? (0)

Anonymous Coward | more than 8 years ago | (#15521078)

I too like the "just code the hard stuff native" approach. Script languages are darned comfy.

That said, the poster seems to come from a client-centric universe where his PC does everything he does fast. But there is sooo much more to computing than the desktop.

I hope the cluesticks he's about to be beaten with don't leave too much of a mark.

It would have been more fun if he'd asked,

"What is a good rule of thumb as to when to code an app. in a native instruction set, and when to go high-level?"

Then we could make two piles for obvious distinction:

FPS engines
MP3 decoders
Kernels
Interpreters for high level languages
RDBMS
Seti@home

My CD roaster front end
My address book
A lightweight wordprocessor
Robot instructions
MythTV front ends

It seems to get really obvious where to draw the line....

Re:Have you tried coding anything hard? (5, Informative)

kpharmer (452893) | more than 8 years ago | (#15521092)

> When your web-based-datastore gets 50,000 inserts per second, hovers between 15 and 20 billion rows and endures a sustained query rate
> of 43,000 queries per hour, tell me which part of it you want coded in PHP.

hmm, the warehouse I work on has multiple databases with billions of rows in them, can hit insert rates of 100,000 rows a second, can experience 60,000 queries/hour - many of which are trending data over 13 months - and has hundreds of users. Many of these users are allowed to directly hit some of the databases with whatever query tool they want. Scans of a hundred million rows at a time aren't uncommon (though they seldom happen more than a few dozen times a day).

This app is completely written in Korn shell, Python, PHP and SQL (DB2). Looks like Ruby is also coming into the picture now, and will probably supplant much of the PHP in order to improve manageability.

Oh yeah, and the frequency of releases is quick and its defect rate is low. And we're planning to begin adding over 400 million events a day soon. I've done similar projects in C and Java, never anywhere near as successfully as in Python and PHP.

We might consider rewriting a few select Python classes in C. Maybe, if Psyco doesn't run when we port the ETL over to the Power5 architecture. Otherwise, it's cheaper to just buy more hardware at this point - since each ETL server can handle about 3 billion rows of data/day with our Python programs.

You are a moron. (1)

Shut the fuck up! (572058) | more than 8 years ago | (#15520745)

End of native code? pfffft. Prepare to be thoroughly owned.

-1 flamebait (4, Funny)

Anonymous Coward | more than 8 years ago | (#15520753)

Didn't we already do this with lisp, like 40 years ago?

Re:-1 flamebait (1)

Ant P. (974313) | more than 8 years ago | (#15520950)

Yeah, someone built an entire self-hosting operating system around a Lisp interpreter. The name escapes me at the moment...

Re:-1 flamebait (0)

Anonymous Coward | more than 8 years ago | (#15521073)

symbolics

Re:-1 flamebait (3, Funny)

Karma Farmer (595141) | more than 8 years ago | (#15521142)

Noo... I think he meant the other lisp operating system.

Emacs.

Re:-1 flamebait (1)

Karma Farmer (595141) | more than 8 years ago | (#15520961)

Didn't we already do this with lisp, like 40 years ago?
No, because people who know Lisp also understand that "native code", "interpreted", "interactive", "virtual machine", and "low level" all share a "loose" relationship, but are not interchangeable ideas.

LISP, BASIC, FORTH, P-Code, Java+Netscape (4, Interesting)

billstewart (78916) | more than 8 years ago | (#15521031)

LISP was a simple, elegant language that demonstrated that almost any language written after 1961 was unnecessary, except for demonstrations of concepts like Object-Oriented programming that could then be re-implemented into LISP, and that any code written in older languages could be replaced with something better :-)

BASIC had its problems, warping a generation of programmers (including me), but it was small and light and didn't take long to learn unless you wanted to find enough tricks to get real work done.

FORTH was smaller, lighter, and faster. It was overly self-important, considering its reinvention of the subroutine to be something new and radical, but if you wanted to program toasters or telescopes it was the language to use. Postscript was somewhat of a Forth derivative.

P-Code was a nice portable little VM you could implement other things on.

And then there was Java, which grew out of Gosling's experiences with NeWS, a Postscript-based windowing system. If you wonder why you're not using Netscape and maybe not using Java, and why you've probably got Windows underneath your Mozilla, it's because it became obvious to lots of people that Netscape+Java was a sufficiently powerful and easily ported environment that the operating system underneath could become nearly irrelevant - so Microsoft had to go build a non-standards-compliant browser and wonky Java implementation and start working on .NET to kill off the threat. It wasn't that conquering the market for free browsers was a big moneymaker - it was self-defense to make sure that free browsers didn't conquer the OS market, allowing Windows+Intel to be replaced by Linux/BSD/QNX/MacOS/OS9/SunOS/etc.

Its inevitable (4, Insightful)

greywire (78262) | more than 8 years ago | (#15520754)

As the overhead of interpreted languages gets smaller (through faster systems, JIT, and other optimizations), it's inevitable that eventually we'll all be using one (unless you are one of the few people who have to program the virtual machines, the JIT compilers, etc).

And this is a good thing, because it means more independence from certain CPU architectures.

Someday, you will be able to use any OS on any CPU and any Application on any OS. This is one step in that direction.

Re:Its inevitable (1)

koreaman (835838) | more than 8 years ago | (#15520782)

What in the world will be the point of the OS then?

Re:Its inevitable (3, Insightful)

lbrandy (923907) | more than 8 years ago | (#15520917)

As the overhead of interpreted languages gets smaller (through faster systems, JIT, and other optimizations), its inevitable that eventualy we'll all be using one (unless you are one of the few people who have to program the virtual machines, the JIT compilers, etc).

This cracks me up. As we head towards multi-core and massively-multi-core, you are telling me that things are going to get better for interpreted languages? Compilers are about to be kicked in the pants because we can only do thread-level parallelism for so long (uh, let's say N=4... when do those quad-cores come out? Early 07?). The trend towards parallelism is going to make compilers infinitely more important, as (memory bandwidth + computation bandwidth) gets the addition of a new, third term: communication bandwidth. It's just one more thing compilers are going to have to learn to deal with... and interpreted languages are going to fall farther behind, not catch up.

There will come a day when we expect our compilers to encode parallel information into the code so it will run faster on our 1024-core machines. Interpreted languages are going to be struggling to do that "just-in-time", like they struggle to do any optimizations now "just-in-time".

Re:Its inevitable (1)

blincoln (592401) | more than 8 years ago | (#15521091)

There will come a day when we expect our compilers to encode parallel information into the code so it will run faster on our 1024-core machines.

Um, like using thread pooling? If you mean a compiler that takes simple linear code and magically makes it run faster on a massively parallel architecture, I'd be very interested in an argument for how that's even logically possible.
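Whether a compiler can discover the parallelism automatically is exactly the hard part; the easy case is when the programmer already knows the iterations are independent, as in this Python sketch using explicit process-level parallelism (illustrating the pattern, not automatic parallelization):

```python
from concurrent.futures import ProcessPoolExecutor

def work(n):
    """An independent, CPU-bound unit of work."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000] * 8

    # Serial version: every iteration runs on one core.
    serial = [work(n) for n in inputs]

    # Parallel version: the same iterations spread across cores.
    # This is safe only because the iterations don't interact --
    # the property a parallelizing compiler would have to prove.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(work, inputs))

    assert serial == parallel
    print("results match across", len(inputs), "tasks")
```

The hard question the parent raises is whether a compiler can prove the independence on its own; when it can't, the programmer has to express it, as here.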

Re:Its inevitable (3, Insightful)

David Greene (463) | more than 8 years ago | (#15521129)

There will come a day when we expect our compilers to encode parallel information into the code so it will run faster on our 1024-core machines.
Too late. That day has been around [cray.com] for 20 years already.

Negligleable performace hit my... (3, Informative)

MBCook (132727) | more than 8 years ago | (#15520763)

Have you ever USED a Java application or applet on Windows? Once they launch, they perform pretty well. Once they launch.

On every computer I use with Windows, it takes up to 20-30 seconds to launch Java. Does the web page have a little "yes, you have Java" applet? Prepare for a massive slowdown. I'd hate to see what it does with applications, which may just appear to hang the computer while Java launches. And don't get me started on that stupid "Welcome to Java 2" dialog that pops up from the taskbar.

Now on my Mac, things are different. Java applets launch just as fast as Flash, if not faster (basically, instantly). This is on my G4, so things would only be better with a Core Duo. Same goes for applications. I've been using an application called YourSQL for over a year. It accesses a MySQL server and works great. It's very fast and has a perfectly native interface. You would think it is a native app, but I recently discovered that it's Java. The end user would NEVER notice that kind of thing, except I was trying to debug a problem in my own code, so I went to investigate how it worked. It was Open Source, and when I downloaded it... it was Java.

Java is fantastic on Mac OS X. I don't know how fast it is on Linux. But as long as there is a 20-30 second launching penalty on Windows, Java will never be accepted. I don't think .NET has this problem, but probably because MS is keeping it memory resident in Vista even if no one is using it.

Then again, maybe Mac OS X preloads Java. I don't know if it has tricks, or if the Windows implementation is just that bad.

Java and Mac OS X (4, Informative)

Kelson (129150) | more than 8 years ago | (#15520871)

Mac OS X treats Java as just another app framework [apple.com] , equivalent to Cocoa or Carbon. (I'm fairly certain I've seen an older version of that diagram that also listed Classic in that layer.) I imagine they've done a bunch of optimizations to tie it into the system, though I don't know whether it launches the runtime at boot or not. You've probably noticed that on Mac OS, you get your Java runtime from Apple, not from Sun or IBM.

The downside is that things don't work quite the same as they do in Sun's Java runtime, so there are differences between Java-on-Windows and Java-on-Mac. For instance, my wife is an avid Puzzle Pirates [puzzlepirates.com] player, and the game client is a Java app. There've been Mac-specific bugs in the past, and at one point a major slowdown appeared when the game was run on a Mac. It hasn't been fixed, so while she can still do crafting on the Mac, whenever she does anything multiplayer, she has to switch to the Windows box.

Re:Java and Mac OS X (1)

BigCheese (47608) | more than 8 years ago | (#15521053)

I love Puzzle Pirates too. It runs wonderfully on my Linux box with the Sun JVM. From what little I've seen Bang Howdy (also from Three Rings) is even more impressive.

Puzzle Pirates made me rethink what you can and can't do in Java.

Re:Negligleable performace hit my... (1)

Eideewt (603267) | more than 8 years ago | (#15520878)

Of course, that problem goes away if the VM is pretty much always in use, since it's been loaded already. Especially if the whole OS is coded on top of it. Then you do have the problem of it always sucking resources though.

Re:Negligleable performace hit my... (0)

Anonymous Coward | more than 8 years ago | (#15520880)

Part of the reason is that Apple has heavily modified the Java JRE to the point that it's pretty much their own implementation (notice how there's no Java OS X download on java.sun.com). This is also the reason why most Java programs really look like they fit in with OS X. Microsoft had a pretty quick JRE written for Windows some time ago, but I think we all know how that turned out. :(

Re:Negligleable performace hit my... (4, Informative)

NutscrapeSucks (446616) | more than 8 years ago | (#15520895)

Let me add some content to your Apple advertisement :)

Apple's JVM implementation has something called Class data sharing [javalobby.org] to speed Java startup after the first invocation. The first time is just as slow as always. Since then the feature has been added to Sun Java 1.5, so if you're up to date, you should have this.

ahh (1)

Billly Gates (198444) | more than 8 years ago | (#15521003)

I took Java last spring for an MIS course and I was expecting the usual slow load times with things like NetBeans. The class was taught with Java 1.5, and I noticed it was fast. All the Java applets just loaded, including Java programs like NetBeans, on my system.

Oddly, I wondered why Azureus and FrostWire seemed to not be that slow anymore as well. I just figured Java didn't suck as much as I thought it did.

Now I know why.

For the slashdotters reading this, I highly recommend upgrading to Java 1.5 on your machine. You will certainly notice a speedup, as I did.

Re:Negligleable performace hit my... (1)

MBCook (132727) | more than 8 years ago | (#15521019)

The computers I use with Windows and Java have 1.5.0_06 and it is still very slow. Once it gets running, Java is nice and fast on Windows, but it just takes forever to start up. Even if Java was running half an hour ago and I've used other stuff since, it's slow.

On my Mac, it never takes much time at all. It's quite quick. I may not use Java for 2 days (I don't turn my Mac off) but the next time I go to use Java it pops up quite fast.

I remember the MS JVM and how much faster it was. That was nice. Still, with Windows being such a HUGE platform you'd think they'd work on better performance for it.

Re:Negligleable performace hit my... (1)

MrSquirrel (976630) | more than 8 years ago | (#15521036)

What are your PC specs and your Mac specs? Proc, RAM, HD speed, OS, browser, anti-virus, anti-malware. All those things make a big difference in computer performance.

Re:Negligleable performace hit my... (1)

MBCook (132727) | more than 8 years ago | (#15521075)

I agree. But whether it's my mother's 2.16 GHz P4 laptop or one of the brand new P4 HT desktops at my school, there is a delay. It's slightly better on the newer computers. But on the older computers I have access to (and by older I mean early P4, late P3) the computers just DRAG when you start Java and can sometimes almost seem to be frozen. All of this is on computers that are quite responsive opening other apps and such.

Re:Negligleable performace hit my... (1)

tepples (727027) | more than 8 years ago | (#15521076)

My PC specs: 0.866 GHz PIII; RAM: 128 MB; OS: Windows 2000 with latest service pack; anti-virus: HouseCall; anti-malware: Spybot SD; bank account: small. What can I do to speed up the JVM?

Re:Negligleable performace hit my... (1)

MBCook (132727) | more than 8 years ago | (#15521090)

I really don't know, I'm sorry to say. I remember when I had to switch from the MS JVM to the Sun one. Things just slowed down SO MUCH when that happened.

I think it's the startup that causes people to say "Java is slow", because once you get INTO the app, if it's written half-decently, Java can be very peppy. It's the launch time that kills it. It doesn't matter how fast the computers at your office are: if they take a minute to boot, people will call them slow.

Try a Mac the next time you're at a store that sells them. You'll be amazed how fast Java starts. THAT'S how Java should be.

Can anyone comment on how fast Java is to start on Linux? I haven't messed with Linux in over a year, and Linux + Java in longer than that.

Re:Negligleable performace hit my... (1)

Billly Gates (198444) | more than 8 years ago | (#15520985)

That is because there are over 100,000 methods in the Java API!

All those need to be dynamically compiled so that Java applications can link to them. One of Java's best strengths is its API. You can look it up by going to www.java.sun.com and selecting the javadocs.

I am sure Perl or Python would be even slower if they had that many APIs to dynamically compile into bytecode.

Java is semi-native, not 100% interpreted, so it can be just as fast as C, or close to it, once a section of code has been compiled at runtime by the JVM for your native processor. Perl and Python are a lot slower because more interpretation is used.
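For what it's worth, CPython also compiles source to bytecode before running it; the difference from a JIT is that CPython then interprets those opcodes instead of translating hot paths to machine code. A small sketch (Python 3 assumed) using the standard dis module makes the bytecode visible:

```python
import dis

def add(a, b):
    return a + b

# Print the bytecode CPython compiled this function to.
dis.dis(add)

# The opcodes are inspectable programmatically as well. The
# addition opcode is BINARY_ADD before 3.11 and BINARY_OP after.
ops = [ins.opname for ins in dis.get_instructions(add)]
assert "BINARY_ADD" in ops or "BINARY_OP" in ops
```

So "interpreted" here really means "bytecode-interpreted": the compile step happens, but the resulting opcodes are executed by the VM loop rather than by the CPU directly.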

Not quite the end yet (2, Informative)

The Evil Evil Muppet (857282) | more than 8 years ago | (#15520767)

At this point, there is still a lot of development happening in "native" languages. Additionally, there are projects in motion to turn bytecode from environments like Java and Python into native code. One of the reasons a lot of people are seeing this seemingly massive movement is because of the technologies these "non-native" solutions leverage. Take both Java and .Net - the support libraries are huge and designed to (more or less) work together. All of that said, I'm a bit sick of either having to put up with the limitations of some of the languages that end up as native code or distributing some runtime environment with my app that immediately gives my users an impression of my product. For that reason, I've started to use D - http://blogs.itoperations.com.au/chris/software/languages/language-choice-is-a-compromise/ [itoperations.com.au] . If you can't be bothered reading my convoluted blog (there's more coming on the subject, along with a project release in coming weeks), go on over to the language's home - http://www.digitialmars.com/d/index.html [digitialmars.com] . It has the vast majority of C++'s features (and more), Java/C#-style syntax and ease whilst compiling to native code.

Re:Not quite the end yet (1)

bioglaze (767105) | more than 8 years ago | (#15521020)

You mistyped the URL. Here's a working one: http://www.digitalmars.com/d [digitalmars.com]

Re:Not quite the end yet (1)

Otter (3800) | more than 8 years ago | (#15521117)

You mistyped the URL.

Heh, I'm so accustomed to "You mistyped..." as the preface to Snotty (and usually misspelled) Nerd Sarcasm, it took a moment to realize that you were being genuinely helpful!

Application! (1)

Daxster (854610) | more than 8 years ago | (#15520769)

You're going to get a lot of the same sort of responses now: lots of people arguing about requirements that these non-compiled programming languages aren't suited for. Every language has a different purpose, set when its creators decide what direction to take.

The answer is to use a mixture. (1)

the eric conspiracy (20178) | more than 8 years ago | (#15520774)

The old saw is to not optimize until you have to. Write in an interpreted language, but be ready to dive into native code when the need for speed arises.

Re:The answer is to use a mixture. (1)

ceoyoyo (59147) | more than 8 years ago | (#15520881)

Exactly. Choose an interpreted language that makes it as easy as possible to do that.

I use Python on OS X. It's quite easy to wrap some C or C++. Objective-C is wrapped for you (which, conveniently, includes all the OS X system frameworks).

There's no reason why you shouldn't write all your GUI stuff in an interpreted language. Using the native GUI system still seems to be a good idea though. Still, there are always going to be algorithms you want to run as fast as possible, meaning you're going to want to write them in something lower level.
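As a minimal sketch of that mixing (assuming a Unix-like system where the C math library can be located at runtime), Python's standard ctypes module can call straight into native code without writing an extension module:

```python
import ctypes
import ctypes.util

# Load the native C math library (assumes a Unix-like system
# where find_library can resolve "m").
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature of double sqrt(double) so ctypes
# converts the argument and return value correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # this call executes in native code
```

In practice you would wrap a hot inner loop this way (or with a proper extension module) and keep the GUI and glue logic in the interpreted language.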

Re:The answer is to use a mixture. (1)

davecb (6526) | more than 8 years ago | (#15520882)

Very much so!

Many moons ago I was on a fast-text-search project, which used a pre-JIT interpreted language (one of Per Brinch Hansen's), and had almost no assembler or native code whatsoever.

It was fast because it used fast hardware, a custom AMD bit-slice processor. We mixed medium-slow code for the boring bits and a read of a custom device for the time-critical part.

--dave

Embedded? (1)

tepples (727027) | more than 8 years ago | (#15521114)

Write in an interpreted language, but be ready to dive into native code when the need for speed arises.

OK, I'm developing for an 8-bit NOAC (6502 based) microcontroller clocked at 1.8 MHz. Does the need for speed arise here?

two things (4, Insightful)

bunions (970377) | more than 8 years ago | (#15520776)

(a) 'loosing': oh jesus christ
(b) the obvious answer is that native vs. interpreted is basically a balance of developer cost versus the cost of end-user resources (RAM, CPU, the user's time). Interpreted code is getting faster every day, no matter what the "OMG JAVA IS SO SLOW DUDE" geniuses on the interweb tell you, but there will always be problem spaces where a 5% speedup pays huge dividends.

Re:two things (1)

blank89 (727548) | more than 8 years ago | (#15520857)

It doesn't take a genius to see that the same OpenGL code that works just fine in C or C++ is slower than molasses at the north pole in winter when ported over to Java.

Re:two things (1)

bunions (970377) | more than 8 years ago | (#15520896)

That's funny, because JOGL isn't anywhere near "slow as molasses". Not if it's written by someone who's paying attention, anyway.

It's not close to native code, of course, but then there's a lot of applications that don't need to be, which was my whole point. For instance, scientific visualization apps typically love to trade increased hardware requirements for decreased development time, because they have such a limited audience, comprised primarily of the programmers themselves. Speaking as a programmer, I can put up with GUI sluggishness in programs I write for myself if it means I never have to track down a memory leak.

Re:two things (1)

blank89 (727548) | more than 8 years ago | (#15520944)

There are plenty of applications where a bit of lag won't do any harm, but there are still enough programs that need higher performance that native code remains a vital part of programming. For instance, 3D benchmarking utilities and games would have a heck of a time in Java. Even optimized in lower-level languages on the latest hardware, the latest games and benchmarks run out of resources (clock cycles, memory, etc.).

Re:two things (1)

bunions (970377) | more than 8 years ago | (#15520962)

Yes. There are several types of applications, some of which are appropriate for interpreted languages, some not so much. Thanks for pointing that out.

Re:two things (1)

blank89 (727548) | more than 8 years ago | (#15521004)

You're quite welcome.

Why isn't anything compiled natively anymore? (3, Insightful)

Crussy (954015) | more than 8 years ago | (#15520824)

Outside of introspection and similar technologies, there is no reason why code cannot be compiled natively. Linux users are aware of compiling Java code natively, and Microsoft is working on a native C# compiler, so why does everyone else think it's still okay to compile to bytecode or scripts? It's not; end of story. I do not like it when every new processor that comes out is stomped on by new programs requiring more resources to do the same job. How many Java programmers use runtime reflection or introspection? In how many programs is it actually needed? If you're not using it, then you should compile natively. Just because Vista is wasting precious resources on its silly Aero Glass doesn't make it right for everyone else too. What happens when someone tries writing a kernel in an interpreted language? Stage-3 bootloaders will throw us into a JIT environment now; I can just imagine the efficiency there. Native languages are the way to go, and we're in for big problems if they don't stay around.

Not so fast (1)

blank89 (727548) | more than 8 years ago | (#15520832)

There will always be native code because it is faster. When code needs to perform better, rather than be more flexible, people will always go back to Asm and other such low level languages.

Someone's been reading too many benchmarks (4, Insightful)

Xugumad (39311) | more than 8 years ago | (#15520835)

"Regardless of the negligible performance hit compared to native code"

Yeah... people keep saying that. Okay, let's take the benchmark I hear about most: http://kano.net/javabench/ [kano.net] "The Java is Faster than C++ and C++ Sucks Unbiased Benchmark". Unbiased my foot. "I was sick of hearing people say Java was slow" is not a good way to start an unbiased benchmark. Let's list a few more problems:

  • This is not Java vs. C++. This is Sun's JDK 1.4.2 vs. GCC 3.3.1 on a P4 mobile processor.
  • GCC is not a fast compiler; it's a portable compiler that happens to be fairly fast. A fast compiler might be something like Intel's own compiler: http://www.linuxjournal.com/article/4885 [linuxjournal.com]
  • Having proven that method calls take almost twice as long under G++ ( http://kano.net/javabench/graph [kano.net] ), the author then implemented several of the tests recursively ( http://kano.net/javabench/src/cpp/fibo.cpp [kano.net] ). When this benchmark came out, various people on /. managed to get around 1,000 times better performance (under G++) by switching to a fixed-memory, non-recursive implementation.
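That last point is easy to reproduce in any language; here is a hypothetical sketch in Python (not the benchmark's C++, but the algorithmic point is language-independent) contrasting the naive recursive Fibonacci with a fixed-memory iterative version:

```python
def fib_recursive(n):
    # Naive doubly-recursive version: the number of calls grows
    # exponentially with n (~2.7 million calls for n = 30).
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    # Fixed-memory iterative version: n additions, two variables.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Same answers, wildly different cost.
assert fib_recursive(20) == fib_iterative(20) == 6765
```

Switching algorithms dwarfs any native-vs-interpreted difference here, which is exactly why benchmarking a deliberately recursive implementation says little about the languages themselves.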


Regardless of the negligible performance hit compared to native code, major software houses, as well as a lot of open-source developers, prefer native code for major projects even though interpreted languages are easier to port cross-platform, often have a shorter development time, and are just as powerful as languages that generate native code.


Y'know, I think there's a reason for that...

Particular to Windows programmers, the announcement of MS-Windows Vista's system requirements means that future Windows boxes will laugh at the memory/processor requirements of current interpreted/JIT compiled languages (e.g. .NET, Java , Python, and others).


Y'know, a couple of decades ago I was running non-native applications on a 7 MHz system with 1 MB of RAM (my old A500). They were fast, but not quite as fast as native. I'm now using a system in the region of 500 times faster in terms of raw CPU, with 2,048 times more memory. And y'know what, non-native stuff is fast, but not quite as fast as native. Something about code expanding to fill the available CPU cycles, methinks...

As A Developer (3, Interesting)

miyako (632510) | more than 8 years ago | (#15520845)

Of the development I do, about 60% is in non-native code (mostly Java) and about 40% is in native code (usually C++). What I have found is this:
Java is the language I use the most, and it's good for small programs. It's definitely noticeably slower for large applications, but I don't think that's the big reason a lot of developers don't like it. Swing is nice, but the problem with Java and a lot of other "modern" languages is that they try so hard to protect developers from themselves and to enforce a certain development paradigm that the same features that make them really nice for writing small programs end up standing in your way for large and complex application development. Looking at the other side of the issue, C++ is fast, it can be fairly portable if it's written correctly, and it has a huge number of libraries available. C++ will let you shoot yourself in the foot, but that's because it's willing to stand out of the way and say "oh, you really want to do that? OK...". This makes it easy to write bad or buggy programs if you don't know what you're doing, but if you pay attention, have some experience, and have a plan for writing the software, then C++ can be less stressful to develop in.
Aside from any reasoned argument, I think a lot of developers are just attached to C/C++. I know that I just enjoy coding in C++ more than in Java. Not that Java is bad, and it can be fun to code in at times, but the lower-level languages give me more of a feeling of actually creating something on the computer, as opposed to on some runtime environment.
Finally, one major reason to stick with C++ is that many interpreted languages aren't really as portable as they pretend to be. A language like C++ that is only mostly portable, and then only if you keep portability in mind, can sometimes be more portable than languages that claim to be perfectly portable and then make you spend weeks trying to debug the program because things are fouling up.

Re:As A Developer (1)

Chosen Reject (842143) | more than 8 years ago | (#15521096)

I for one love pointers. That is, I want to decide whether I do call-by-value or call-by-reference; I dislike Java's everything-is-call-by-value semantics. Even if Java gave me the ability to play with memory and pointers, I'd still prefer C/C++ though.

Managed and Unmanaged Code... (1)

ndykman (659315) | more than 8 years ago | (#15520856)

Overall, I think the answer is yes. As far as .NET goes, there are a lot of advantages to having libraries and applications in a format that can be just-in-time compiled, or pre-compiled to native code on installation. Certainly for MS it is a big advantage, as they don't have to target a lowest common denominator of CPU features. Another advantage is that .NET has pretty decent interoperability with native libraries, so if you need native performance, you can use a native library to try to gain some performance back without going totally native.

As for Java, I think it has a couple of issues that give it a bad impression. Certainly its integration with native code could be improved (JNI is pretty hairy at times). Also, it seems that the current JREs don't have a lot of native tweaks to reduce startup time. I think Java could use a utility like .NET's ngen to precompile libraries to native code; used correctly, that can improve cold-start times.

Cross Platform not related to language (2, Informative)

Mr. Sketch (111112) | more than 8 years ago | (#15520862)

The choice of language does not determine whether something is cross-platform; that has more to do with the choice of toolkit. If you are using GTK or wxWidgets, you are pretty safe. C and C++ are cross-platform languages, but if you use MFC and COM, your program is not.

Even if I use Java or C#, without a cross-platform toolkit (Windows Forms, for example, is not cross-platform) the application won't be cross-platform.

It doesn't matter that the language compiles to bytecode; if that bytecode doesn't use a cross-platform toolkit, it won't be cross-platform.

What will we be loosing? (1)

scdeimos (632778) | more than 8 years ago | (#15520906)

I am not expecting to be loosing arrows from my bow any time soon, but I am hoping that the /. editors can stand to be losing some O's from the summary. :)

Re:What will we be loosing? (1)

shoolz (752000) | more than 8 years ago | (#15521012)

I too detest abysmal spelling mistakes like that. If geeks coded the way some writers write, we'd end up with... well... Windows.

Huh? (1)

Karma Farmer (595141) | more than 8 years ago | (#15520918)

Do programmers feel that they are loosing - an arguably needed low-level - control when they do interpreted languages?
Could someone JIT this into English for me? What does "low-level control" have to do with "interpreted languages"? And how is either related to native code and virtual machines?

And wtf is "loose"?

Not today, not tomorrow. (2)

Perseid (660451) | more than 8 years ago | (#15520919)

Silly question. The answer is and will always be: No.

Commodore 64 BASIC was interpreted. Computers now are obviously powerful enough to run C64 BASIC code very quickly. Does that mean native code should have been abandoned years ago, because technology advanced enough to let C64 code run quickly? JIT code will always be slower than native code, and because both JIT and native programs grow more complex as the technology advances, interpreted code can never catch up.

GRAMMAR NAZI (3, Funny)

illuminatedwax (537131) | more than 8 years ago | (#15520920)

List of things you cannot loose:
- your gray hairs (unless you can command them somehow)
- control
- the big game
- your way

List of things you could be loosing:
- the hounds
- your belt
- an arrow
- responsibility

Loved the (0)

Anonymous Coward | more than 8 years ago | (#15521069)

examples.

OOOor he could have a twitch in his finger from drinking toooo much cooffee. I get thooose all the tiiimmme!

Re:GRAMMAR NAZI (1)

djdanlib (732853) | more than 8 years ago | (#15521140)

Actually, Mr. Nazi, you "loosen" your belt. But thank you for your brief list.

CPUs still have *A LOT* to evolve (3, Insightful)

mangu (126918) | more than 8 years ago | (#15520969)

Interpreted languages have been OK for a long time, for applications where performance isn't the most important parameter.


Now find me one CPU that can do a decent simulation of the physics of continuous media. Why isn't there any game where you ride a surfboard realistically? Why do meteorologists use the most powerful number crunching systems in the world to be wrong in 50% of the cases when predicting weather a week ahead?


And what about artificial intelligence and neural networks? Find me a CPU that can do a decent OCR, or speech recognition. What about parsing natural language? Why can't I search in Google by abstract concepts, instead of isolated words?


No matter how powerful CPUs are, they are still ridiculously inadequate for a large range of real world problems. When you go beyond textbook examples, one still needs to squeeze every bit of performance that only optimized compilers can get.

Re:CPUs still have *A LOT* to evolve (4, Insightful)

An Onerous Coward (222037) | more than 8 years ago | (#15521136)

You seem to be under the impression that these problems you cite display inadequacies in the hardware, rather than the software. But, in the words of some fictional professor from a book I can't remember: "If you speed up a dog's brain by a factor of a million, you'll have a machine that takes only three nanoseconds to decide to sniff your crotch." Given the current software and algorithms available, more computing power alone wouldn't solve any of the problems you describe.

My head's going to explode!!!!!! (0, Redundant)

ConceptJunkie (24823) | more than 8 years ago | (#15520992)

It's "LOSING"! L-O-S-I-N-G! One "O". Just one.

The opposite of "win" or "gain" is "lose".

"Loose" rhymes with "Goose". It can be used as a verb, but it means something different than "lose".

Geez, doesn't anybody read any frickin' books any more?! If you read books once in a while, you would actually learn to spell.

OK, you can mod me down, I'm done.

Well, yes and no (4, Insightful)

BigCheese (47608) | more than 8 years ago | (#15521023)

Don't you hate that answer?

Yes, we are seeing more development in non-native code, but it gets its power from the underlying libraries and core code that are native. The line between them gets fuzzy when you toss in JIT and scripting-to-native-code compilers. It really depends on the problem area. If I'm just parsing a bunch of log files to make reports, Perl or Python would be best. Web apps seem to benefit from the safety net of non-native code, but I'm sure there are exceptions.
OTOH, there are plenty of apps that need all the speed and memory the machine can provide. My current job involves real-time financial data delivery; writing that in Python or Java would (probably) not work out too well. OS code that works directly with hardware will probably stay in assembler or C. Fast low-level stuff is what allows the slower high-level stuff to be useful.

Either way, you still need to know what you're doing, because in the end both native code and interpreted code run as opcodes on a CPU and use hardware resources. You need to mind memory use in Java just like in C, only in different ways. You need to watch what you do in inner loops in both Python and C++. Linear lookups can cause scaling problems in Perl, Java, Python, or C/C++.
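That caution about linear lookups works the same in every language; as a small Python sketch, swapping a list scan for a hash-based set turns an O(n) membership test into an average O(1) one:

```python
# Membership testing: a list is scanned element by element (O(n)),
# while a set uses a hash table (O(1) on average).
needles = list(range(10_000))

as_list = needles          # linear scan on every "in" test
as_set = set(needles)      # one-time build, then hash lookups

assert 9_999 in as_list    # correct, but scans the whole list
assert 9_999 in as_set     # same answer, constant-time on average
assert 10_000 not in as_set
```

Do this inside an inner loop over a large input and the data-structure choice, not the language, dominates the running time.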

It all depends on how fast you want to get from problem to solution, how much hardware you can throw at it, how complicated the problem is, how much time you have and many other factors.

Languages are tools, not a religion. The broader your knowledge the more tools you have at your disposal. Pick the best one for the job at hand.

Depends on the task (4, Insightful)

DigitalCrackPipe (626884) | more than 8 years ago | (#15521025)

Ok, assuming the post isn't flamebait... This issue keeps coming up. A good programmer should understand that the language choice depends on the task at hand.

If you're making a pretty GUI, you may want an easy-to-use, portable language and may not care as much about performance. If you're building a high-performance backend, or doing some realtime processing, an interpreted language is practically useless.

Before deciding which paradigm is superior, you must narrow down the question to a type of task. Since the variety of tasks we use software for does not seem to be shrinking, it seems that this issue will not be resolved decisively anytime soon.

Competition? Progress? (1)

maccam94 (840004) | more than 8 years ago | (#15521032)

This isn't necessarily good for the processor market, because it makes everyone stick to the same standards, even if they're inefficient. Oh cool, Company X has designed a processor with a cool new feature that will make programs wayyyyy faster! Oh wait, there's that darn high-level language; too bad it'll never be used. Of course, it may be beneficial, because people's progress would be channeled in one direction, but that is only useful for so long before true innovation almost comes to a standstill.

Just my $0.02.

Embedded Platforms (2, Insightful)

Anonymous Coward | more than 8 years ago | (#15521048)

I'm a Software Engineering Consultant who works in the avionics industry. While the enterprise application market may be moving that way, avionics certainly is not. I write about 80% C code, 15% C++ (helper apps), and 5% assembly code. The cost involved in certifying any sort of interpreter or JIT compiler is not worth it to anyone doing safety-critical RTOS work. Any speedup in application development would be offset by several orders of magnitude by the effort to make that language available on the embedded platform through a certification process that would likely take a large team of developers several years. And maybe by then the cost of the hardware will have dropped enough to support it. Right now the product I work on has 512 kB of flash and 1 MB of RAM, and it runs on an ultra-low-power 20 MHz processor.

But for all you folks working in the application sphere, I'm sure things are heading that way, as portability becomes a larger issue between OSes and architectures while the market moves away from a Windows-based x86 monoculture. For those of us working on products that include hardware, though, extra cycles aren't in the budget, since they drive up the per-unit cost.

Re: The End of Native Code? (0)

Anonymous Coward | more than 8 years ago | (#15521059)

Subject: Re: The End of Native Code?
Body: No.

Let lose the dogs of war.... (1)

otis wildflower (4889) | more than 8 years ago | (#15521080)

... there's no time to loose!

(or is that "there's No-Time Toulouse! [orangecow.org] ")

Layers of Indirection... (1)

LordZardoz (155141) | more than 8 years ago | (#15521081)

The one edge that native code will always have over interpreted code is that native code can just be loaded and fed to the CPU. An added concern is that most computers are doing more than one thing at a time. If all the CPU had to do was interpret one program and run it, that would be one thing; but if the CPU is trying to run 5 or 12 apps and has to interpret more than one of them, there will be a bit of a logjam.

END COMMUNICATION

It depends (4, Insightful)

Sloppy (14984) | more than 8 years ago | (#15521141)

Interpreted & JIT languages are "within a constant factor" of native code's speed, and CS students are taught that such things don't matter. ;-)

And for many types of apps, they really don't. Ten times slower than instantaneous, is instantaneous.

But people use computers for lots of things, and believe it or not, some of those things are still CPU-bound, and take so much work that humans can perceive the delay. Your word-processor is 99% idle so surely it doesn't need to be native, but you know that somewhere on this planet, a poor shmuck is staring at an hourglass icon, waiting for a macro to finish. The real question is: who cares? Is that guy's time worth more, or is the programmer's time worth more?
