
Java Program Uses Neural Networks To Monitor Games

Soulskill posted more than 5 years ago | from the automating-the-automation dept.


tr0p writes "Java developers have used the open source Neuroph neural network framework to monitor video game players while they play and then provide helpful situational awareness, such as audio queues when a power-up is ready or on-the-fly macros for combo attacks. The developers have published an article describing many of the technical details of their implementation. 'There are two different types of neural networks used by DotA AutoScript. The first type is a simple binary image classifier. It uses Neuroph's "Multi-Layer Perceptron" class to model a neural network with an input neurons layer, one hidden neurons layer, and an output neurons layer. Exposing an image to the input layer neurons causes the output layer neurons to produce the probability of a match for each of the images it has been trained to identify; one trained image per output neuron.'"
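For the curious, the Neuroph calls behind a classifier like the one described are fairly compact. A rough sketch follows (class names as in Neuroph 2.x and may differ in other versions; the layer sizes and the pixel-loading helper are made up for illustration):

    import org.neuroph.core.data.DataSet;
    import org.neuroph.core.data.DataSetRow;
    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.util.TransferFunctionType;

    public class IconClassifierSketch {
        public static void main(String[] args) {
            // A 16x16 grayscale icon -> 256 input neurons, one hidden
            // layer, and one output neuron per icon to be recognized.
            MultiLayerPerceptron net = new MultiLayerPerceptron(
                    TransferFunctionType.SIGMOID, 256, 32, 3);

            DataSet training = new DataSet(256, 3);
            training.addRow(new DataSetRow(loadPixels("icon_a.png"),
                    new double[]{1, 0, 0}));
            // ... one row per trained icon ...
            net.learn(training);

            // Expose a captured screen region to the input layer; each
            // output neuron then holds the match probability for the
            // image it was trained on.
            net.setInput(loadPixels("unknown.png"));
            net.calculate();
            double[] probabilities = net.getOutput();
        }

        // Hypothetical helper: pixel values scaled to the 0..1 range.
        static double[] loadPixels(String file) { return new double[256]; }
    }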


yawn (0, Troll)

Adolf Hitroll (562418) | more than 5 years ago | (#27948117)

Whoever found this cool should be waterboarded...

Can it.... (2, Funny)

Barny (103770) | more than 5 years ago | (#27948127)

Ohhh, can it tell me when to move and shoot as well? Hey, then interface it with the keyboard and mouse inputs, and all my games can play themselves (like masturbation for computers).

Then I can do other things while having fun playing games.

Cheating? (1)

Warlord88 (1065794) | more than 5 years ago | (#27948145)

Ohhh, can it tell me when to move and shoot as well?

Maybe not now. Currently it's only for DotA. But it might easily be extended to other games, as you say. If so, won't cheating be easy in multiplayer games? How can that be prevented?

Re:Cheating? (2, Insightful)

maxume (22995) | more than 5 years ago | (#27949465)

Don't play against jackasses. Makes public servers a bit harder to deal with, but it is an easy solution otherwise.

Re:Can it.... (1)

Jurily (900488) | more than 5 years ago | (#27948325)

Ohhh, can it tell me when to move and shoot as well?

It's emulating the human brain in a VM.

My equivalent implementation: while(true) {sleep(HALF_AN_HOUR); printf("You need to respawn.");}

Re:Can it.... (0)

Anonymous Coward | more than 5 years ago | (#27948833)

If you want to play games while you play games, I think I know someone who can help.

Re:Can it.... (2, Funny)

fractoid (1076465) | more than 5 years ago | (#27949631)

And he'll call you 'dawg'? :P

Re:Can it.... (1)

wisty (1335733) | more than 5 years ago | (#27949225)

It can also lock you out of the airlocks, but only by accident.

Sorry, I'm just a bit upset that it's 2009, and our biggest problem with computers is still that they just aren't smart enough.

Re:Can it.... (2, Interesting)

Gastrobot (998966) | more than 5 years ago | (#27949739)

I wrote a program in Java that used an artificial neural network to play Warning Forever [wikipedia.org] . I took an evolutionary approach to training the networks because it was an unsupervised task. The program started with a pool of random networks, determined fitness by whatever criteria I used on a given run, and bred the networks together to make a new generation.

My program had no capacity to play the game with a different interface than a human. It actually read the values of the pixels on the monitor, processed them through its network, and yielded keystrokes.

It never got very good and eventually I got bored and moved on. I have some ideas for yielding better results that I want to try someday. Here's what happened:

1) Playing on lives didn't work because eventually a network would destroy the boss' offensive capabilities and hide in a corner. The game would never progress.
2) When I tried playing the game on a timer and ranking networks by what level they got to it ended up not moving at all and just shooting forward to destroy some bosses.
3) When I tried ranking by time survived the network would again just destroy the boss' offensive capability and hide.

1 is technically a perfect solution for the fitness criteria that I supplied. 2 and 3 are both examples of local minima where the networks found an early solution that dominated the competition (other networks, not the boss) and thus the gene pool.
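The generational loop described above is simple to sketch in Java. This toy version uses a flat weight vector as the genome, and the fitness function is a placeholder for "drive the game from the screen and score the run":

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;
    import java.util.Random;

    public class EvolveSketch {
        static final Random RNG = new Random();

        public static void main(String[] args) {
            List<double[]> pool = new ArrayList<>();
            for (int i = 0; i < 50; i++) pool.add(randomGenome(100));

            for (int gen = 0; gen < 100; gen++) {
                // Rank by fitness, best first (criterion per run: lives,
                // level reached, time survived, ...).
                pool.sort(Comparator.comparingDouble(g -> -fitness(g)));

                // Keep the fittest half, breed them to refill the pool.
                List<double[]> next = new ArrayList<>(pool.subList(0, 25));
                while (next.size() < 50) {
                    next.add(crossover(next.get(RNG.nextInt(25)),
                                       next.get(RNG.nextInt(25))));
                }
                pool = next;
            }
        }

        static double[] randomGenome(int n) {
            double[] g = new double[n];
            for (int i = 0; i < n; i++) g[i] = RNG.nextGaussian();
            return g;
        }

        // Uniform crossover: each weight comes from one parent or the other.
        static double[] crossover(double[] a, double[] b) {
            double[] child = new double[a.length];
            for (int i = 0; i < a.length; i++) child[i] = RNG.nextBoolean() ? a[i] : b[i];
            return child;
        }

        // Stand-in for "play the game with these weights and score the run".
        static double fitness(double[] genome) {
            double s = 0;
            for (double w : genome) s -= w * w;
            return s;
        }
    }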

Re:Can it.... (1)

skelterjohn (1389343) | more than 5 years ago | (#27950247)

1) Playing on lives didn't work because eventually a network would destroy the boss' offensive capabilities and hide in a corner. The game would never progress.

Why on earth would that poor guy want to get to a new level for more hurt? He's a lover, not a fighter! Why can't we all just get along?

Re:Can it.... (0)

Anonymous Coward | more than 5 years ago | (#27950627)

I've never even seen the game you're talking about, but what about:

  (4) Maximize (Level - X * lives-lost) in time T.
  (5) Minimize (time + Y * lives-lost) to level N.

Re:Can it.... (1)

iiiears (987462) | more than 5 years ago | (#27950639)

Could this be adapted to recognize desktop interfaces, text, menus, tabs, etc.? A blind user might find it useful and portable.

Re:Can it.... (0)

Anonymous Coward | more than 5 years ago | (#27951123)

Screw that! Can it be adapted to recognize jiggling boobs and thongs? It's a waste of space on the DVR to record some shows on the Spanish channels when I can't speak the language.

Re:Can it.... (0)

Anonymous Coward | more than 5 years ago | (#27950685)

2 and 3 are both examples of local minima where the networks found an early solution that dominated the competition (other networks, not the boss) and thus the gene pool.

There's a solution for that. A good genetic algorithm needs some random mutations thrown in on the breed cycle. The idea is to knock a few offspring away from the current "hill" and possibly find a higher local extreme. Do that enough times, and it should find the global extreme and therefore the optimal solution.
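In code the mutation step is tiny; here's a sketch of the kind of method that could be dropped into a breed loop like the one sketched earlier in the thread (the rate and kick size are arbitrary illustrative values):

    import java.util.Random;

    public class MutationSketch {
        // Occasionally kick a weight off its current "hill" so later
        // generations can wander toward other optima.
        static double[] mutate(double[] genome, Random rng) {
            final double RATE = 0.02; // per-weight mutation chance (illustrative)
            final double KICK = 0.5;  // size of the random nudge (illustrative)
            double[] out = genome.clone();
            for (int i = 0; i < out.length; i++) {
                if (rng.nextDouble() < RATE) {
                    out[i] += rng.nextGaussian() * KICK;
                }
            }
            return out;
        }
    }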

Re:Can it.... (1)

KDR_11k (778916) | more than 5 years ago | (#27950993)

WF is score based, track that.

Re:Can it.... (1)

declain (1338567) | more than 5 years ago | (#27955557)

I also wrote a similar program to play old NES games, and it turned out that the best network used some unknown strategy to beat the game: Up-Up-Down-Down-Left-Right-Left-Right B, A, Start

Re:Can it.... (1)

khellendros1984 (792761) | more than 5 years ago | (#27957027)

Couldn't you make it base its fitness value on how many bosses it destroys or something?

Re:Can it.... (1)

oneplus999 (907816) | more than 5 years ago | (#27953155)

Ohhh, can it tell me when to move and shoot as well?

I know this is a joke, but you've clearly never played DotA :) It has the most advanced gameplay of literally any game I've ever played in all my years of gaming. Saving me a couple of keystrokes is all this program does. I've been using it for a couple weeks now. While it is helpful, it's hardly playing for me.

Re:Can it.... (1)

Barny (103770) | more than 5 years ago | (#27995009)

Nope, in fact I don't even know what it stands for; the site linked doesn't tell me either. I assume from the links all over the site it's some WoW x-pac?

Huh. (4, Informative)

Count Fenring (669457) | more than 5 years ago | (#27948155)

Probably no one cares, but that's the wrong "queues" there. They mean "cues."

Re:Huh. (2, Funny)

julesh (229690) | more than 5 years ago | (#27948577)

Probably no one cares, but that's the wrong "queues" there. They mean "cues."

I was wondering why I had trouble parsing that sentence, but didn't spot the reason. Thanks. :)

Re:Huh. (1)

dna_(c)(tm)(r) (618003) | more than 5 years ago | (#27948881)

Eye two thing dose spelt checkers r a mazing.

Re:Huh. (3, Funny)

jonaskoelker (922170) | more than 5 years ago | (#27949231)

Probably no one cares, but thats the wrong "queues" their. They mean "cues."

Broke that for you.

Re:Huh. (1)

Count Fenring (669457) | more than 5 years ago | (#27951937)

HA!

It's not Slashdot's most annoying meme, but an incredible imitation!

Well done, sir. Well and truly done.

Re:Huh. (1)

ricky-road-flats (770129) | more than 5 years ago | (#27950861)

I do care, and it's good to know someone else does. I've seen so many queues recently where there should have been cues, it's getting nearly as bad as the misuse of 'momentarily'.

The '90-ties are over (2, Informative)

Anonymous Coward | more than 5 years ago | (#27948161)

Why on earth are people still wasting their time on Neural Networks? Sure, they have a catchy name, but everything else about them sucks. Today we have much more robust methods available, e.g. Relevance Vector Machines.

Re:The '90-ties are over (3, Funny)

Anonymous Coward | more than 5 years ago | (#27948193)

I prefer ties from the 1960s.

Re:The '90-ties are over (1)

jowilkin (1453165) | more than 5 years ago | (#27948231)

Agreed, all they are doing is image recognition. Neural networks are definitely not the best way to accomplish this. They sure sound catchy to the average Joe though!

Re:The '90-ties are over (1)

skelterjohn (1389343) | more than 5 years ago | (#27950277)

Easy to implement. It's the first plausible-seeming thing that gets taught in a machine learning course. And sometimes, like, say... HERE.... it works pretty well!

Re:The '90-ties are over (1)

Cheapy (809643) | more than 5 years ago | (#27956173)

It's open source, it's gonna take a while to catch up :p

Nehru networks? (1)

gringofrijolero (1489395) | more than 5 years ago | (#27948243)

I thought that went out with the 60s.

Hilarious Overkill (4, Insightful)

phantomcircuit (938963) | more than 5 years ago | (#27948251)

So they designed and wrote a neural network for the sole purpose of identifying a limited set of icons? Seriously?

They could have done this using conventional methods that would be significantly faster. Me thinks someone was just doing this for entertainment.

Re:Hilarious Overkill (1)

physburn (1095481) | more than 5 years ago | (#27948365)

They we're doing it for a gradiate thesis. At least Neuroph was built for a thesis. What conventional method for image recognition do you think would be faster?

Re:Hilarious Overkill (0)

Anonymous Coward | more than 5 years ago | (#27948393)

Multi-resolution analysis, perhaps? An example of this method is wavelet decomposition.

Re:Hilarious Overkill (4, Insightful)

julesh (229690) | more than 5 years ago | (#27948497)

Multi-resolution analysis, perhaps? An example of this method is wavelet decomposition.

Which is even more processor-intensive than a moderately sized neural net.

How about NO image recognition? (1)

Moraelin (679338) | more than 5 years ago | (#27948395)

They we're doing it for a gradiate thesis. At least Neuroph was built for a thesis. What conventional method for image recognition do you think would be faster?

I don't know what the GP had in mind, but for my take: How about _no_ image recognition in the first place?

If you need to see when a game icon is activated, how about just looking at the byte that stores the state for that icon?

Re:How about NO image recognition? (5, Informative)

happylight (600739) | more than 5 years ago | (#27948585)

It's an external program that doesn't have access to the game's internal data structures. Basically, it's a bot without hooks into the game that makes decisions solely by looking at the screen.
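Grabbing the pixels from outside the game is the easy part; the standard library's java.awt.Robot can do it. A minimal sketch (the rectangle coordinates are invented placeholders for wherever the skill icon sits):

    import java.awt.AWTException;
    import java.awt.Rectangle;
    import java.awt.Robot;
    import java.awt.image.BufferedImage;

    public class ScreenGrabSketch {
        public static void main(String[] args) throws AWTException {
            Robot robot = new Robot();
            // Capture just the region where the icon is drawn
            // (placeholder coordinates).
            BufferedImage icon = robot.createScreenCapture(
                    new Rectangle(1180, 940, 48, 48));
            // The pixels can then be normalized and fed to the
            // network's input layer.
            int topLeft = icon.getRGB(0, 0);
        }
    }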

Re:How about NO image recognition? (4, Insightful)

Moraelin (679338) | more than 5 years ago | (#27948683)

As someone who's been writing external trainers for games for years (though admittedly it was some years ago), I can assure you first hand that accessing a game's internal data structures is indeed very possible.

And even if I couldn't find that boolean, I'd at least try to hook the point where it tries to draw that icon.

The idea of using image recognition on the screen is so horribly inefficient a method... I suppose it could be used if absolutely nothing else works, but really that's about it.

Re:How about NO image recognition? (1)

ChienAndalu (1293930) | more than 5 years ago | (#27948989)

I tried this once and never came very far - can you give me a few 'pointers' on where to find resources and websites that explain these techniques?

Re:How about NO image recognition? (0)

Anonymous Coward | more than 5 years ago | (#27962487)

Get used to a debugger and a disassembler. I recommend OllyDbg and IDA Pro. If the game comes with a copy protection system, get a cracked copy of the executable first (breaking those is more of an exercise in frustration than a challenge). I have not personally made any trainers for games, I like keygens more, but the basic idea is to set a breakpoint on a useful Win32 API function (something to do with drawing the icons, perhaps), and trace back from there. Making periodic copies of the game's .data section and diffing them can also reveal relevant variables (you might have to monitor any data structures present in there as well). I think there is even a program to automatically do this.

Once you have figured out how the game works, go read up on the Windows process debugging API. If ReadProcessMemory/WriteProcessMemory fail, you can alternatively inject a thread of your own into the game process.
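From Java, that same ReadProcessMemory approach needs native interop. A rough sketch using the JNA library's Win32 bindings follows (exact JNA class and method signatures vary by version, and the process id and address are placeholders for values you'd have dug up with the debugger, as described above):

    import com.sun.jna.Memory;
    import com.sun.jna.Pointer;
    import com.sun.jna.platform.win32.Kernel32;
    import com.sun.jna.platform.win32.WinNT;
    import com.sun.jna.ptr.IntByReference;

    public class TrainerSketch {
        public static void main(String[] args) {
            int pid = 1234;                   // the game's process id (placeholder)
            long iconStateAddr = 0x00A1B2C4L; // address found with the debugger (placeholder)

            WinNT.HANDLE proc = Kernel32.INSTANCE.OpenProcess(
                    WinNT.PROCESS_VM_READ, false, pid);
            Memory buf = new Memory(1);
            Kernel32.INSTANCE.ReadProcessMemory(proc,
                    new Pointer(iconStateAddr), buf, 1, new IntByReference());
            boolean ready = buf.getByte(0) != 0; // the boolean the parent mentions
            Kernel32.INSTANCE.CloseHandle(proc);
        }
    }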

Re:How about NO image recognition? (2, Interesting)

skelterjohn (1389343) | more than 5 years ago | (#27949969)

Except to do something like that you have to analyze the program code of every game you make a trainer for. I've never done that sort of thing, but it seems scary.

An approach like the one they've outlined can probably be moved from game to game with only parameter tweaks.

That is the true beauty of machine learning :)

Re:How about NO image recognition? (1)

oneplus999 (907816) | more than 5 years ago | (#27953095)

I can assure you first hand that accessing a game's internal data structures is indeed very possible.

It could actually be simpler to do it his way. Icons never move, and he already knows where they are going to be on the screen, so no need to figure out what the data under the hood looks like. While it's certainly more processor intensive, and probably less elegant, Warcraft 3 is a pretty old game, so I don't think it's a big deal for most computers to handle the extra load. Or he could just be doing it cause it's more fun this way :)

Re:How about NO image recognition? (1)

vikstar (615372) | more than 5 years ago | (#27960779)

Accessing a game's internal data structures could often be against the EULA, whereas reading your graphics card's frame buffer isn't.

Re:Hilarious Overkill (0, Offtopic)

Bored Grammar Nazi (1482359) | more than 5 years ago | (#27948413)

They we're doing it for a gradiate thesis. At least Neuroph was built for a thesis. What conventional method for image recognition do you think would be faster?

I think you meant "they were" instead of "they we are".

Re:Hilarious Overkill (0)

Anonymous Coward | more than 5 years ago | (#27949133)

They we're doing it for a gradiate thesis. At least Neuroph was built for a thesis. What conventional method for image recognition do you think would be faster?

I think you meant "they were" instead of "they we are".

Hah! You missed the "gradiate"!

Re:Hilarious Overkill (1)

fractoid (1076465) | more than 5 years ago | (#27949645)

Nah, their thesis was just marked in fine increments.

Re:Hilarious Overkill (1)

deander2 (26173) | more than 5 years ago | (#27949805)

I would have tried non-negative matrix factorization, personally.

Re:Hilarious Overkill (1)

skelterjohn (1389343) | more than 5 years ago | (#27949935)

Template matching might have been faster. If you know exactly what image you're looking for, you can find it in time linear with the number of screen pixels. I didn't bother reading the article, but it seems unlikely that their algorithm started with the icons it was looking for. I am not an expert on neural nets specifically (though I am a machine learning researcher so I have a passing familiarity with them) but I don't know how you would give it that knowledge. It's possible that the NN figured out the icons on its own, which would be a *very* cool result.
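For reference, naive exact template matching is short to write. A sketch over BufferedImages (exact pixel equality only, which is its weakness against any scaling, alpha blending, or animation):

    import java.awt.image.BufferedImage;

    public class TemplateMatchSketch {
        // Compare the template against one screen position, bailing out
        // on the first mismatching pixel.
        static boolean matchesAt(BufferedImage screen, BufferedImage icon,
                                 int x, int y) {
            for (int j = 0; j < icon.getHeight(); j++) {
                for (int i = 0; i < icon.getWidth(); i++) {
                    if (screen.getRGB(x + i, y + j) != icon.getRGB(i, j)) {
                        return false;
                    }
                }
            }
            return true;
        }

        // Slide the template over the capture; the early exit above keeps
        // this close to linear in screen pixels in practice.
        static int[] find(BufferedImage screen, BufferedImage icon) {
            for (int y = 0; y + icon.getHeight() <= screen.getHeight(); y++) {
                for (int x = 0; x + icon.getWidth() <= screen.getWidth(); x++) {
                    if (matchesAt(screen, icon, x, y)) {
                        return new int[]{x, y};
                    }
                }
            }
            return null; // not on screen
        }
    }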

Re:Hilarious Overkill (1)

skelterjohn (1389343) | more than 5 years ago | (#27950047)

Ok, I skimmed the article.

They don't need to find the image - they start out knowing where it is, so stuff like template matching is irrelevant.

And since the NN started out knowing where the icons were, figuring out which ones matched which signals is not as impressive. It's just a ho-hum NN application, now.

Re:Hilarious Overkill (2, Informative)

Another, completely (812244) | more than 5 years ago | (#27948391)

Me thinks someone was just doing this for entertainment.

Almost certainly, especially since a complete success would just mean they can play video games slightly more efficiently.

This toolkit worked for them, but does using a neural networking toolkit mean that what you produce is a neural network? It looks like the output neurons are doing image matching, and the hidden layer is identifying interesting candidates from a stream. In their environment, interesting candidates are any box that ticks from dim to bright (so they can spot the re-charged state when it gets fully lit).

As described, it sounds more like a pipeline than a network. They use training data, rather than hard-coding target images, but it's not clear to me that the training feedback goes between neurons. It looks like you could do unit testing on the individual neurons, which doesn't describe neural networks as I understand them.

I still think it's a neat tool that they made, but would some AI geek out there like to comment on calling it a neural network?

Re:Hilarious Overkill (2, Informative)

jowilkin (1453165) | more than 5 years ago | (#27948455)

It IS a neural network. A neural network is a mathematically well-defined thing. That doesn't mean it was a good idea to use a neural network in this case (it wasn't, IMO). I certainly wouldn't sign off on this as a thesis; what is the research aspect supposed to be? Using neural networks for image recognition was done and recognized to be a bad idea a really long time ago.

Re:Hilarious Overkill (0, Troll)

LingNoi (1066278) | more than 5 years ago | (#27949853)

Using neural networks for image recognition was done and recognized to be a bad idea a really long time ago

Yet you don't say why. -1 Informative.

Re:Hilarious Overkill (1)

ClosedSource (238333) | more than 5 years ago | (#27952179)

Re:Hilarious Overkill (1)

LingNoi (1066278) | more than 5 years ago | (#27952271)

All your link demonstrates is that you need to get your training data correct. This doesn't tell me anything about why neural networks are "a bad idea".

Re:Hilarious Overkill (2, Interesting)

ClosedSource (238333) | more than 5 years ago | (#27952461)

The point is that you have no way of determining what training data is "correct" because you don't know anything about what the NN is "looking at".

There's also no guarantee that the network will ever converge, and if it does, there's no way to know if it has converged to a local minimum which isn't the solution rather than a global minimum.

Re:Hilarious Overkill (-1, Troll)

Jurily (900488) | more than 5 years ago | (#27948421)

So they designed and wrote a neural network for the sole purpose of identifying a limited set of icons? Seriously?

IN JAVA. Speed was obviously not in the design criteria.

Re:Hilarious Overkill (4, Insightful)

mcvos (645701) | more than 5 years ago | (#27948843)

IN JAVA. Speed was obviously not in the design criteria.

The '90s are over. Java is now one of the fastest languages around.

Re:Hilarious Overkill (-1, Troll)

Jurily (900488) | more than 5 years ago | (#27948885)

The '90s are over. Java is now one of the fastest languages around.

Around where?

Re:Hilarious Overkill (2, Funny)

molecular (311632) | more than 5 years ago | (#27949067)

The '90s are over. Java is now one of the fastest languages around.

Around where?

Everywhere, that's the cool thing about Java, you know.

Re:Hilarious Overkill (4, Insightful)

daid303 (843777) | more than 5 years ago | (#27949147)

Except on everything not x86. The speed you currently see on desktop/server Java is only accomplished by very good just-in-time compilers, which are tweaked for x86. So everything else runs Java like crap.

http://en.wikipedia.org/wiki/ARM_architecture [wikipedia.org] And that includes a few billion ARM processors used in mobile phones. Sure, they can run Java, but it's nowhere near as fast as C.

Java is nice, but the 'runs everywhere' feature is the least interesting one of them all. I can run an emulated full-blown x86 on an 8-bit microcontroller, but that does not make it useful.

Phoneme Advanced has JIT for ARM (0)

Anonymous Coward | more than 5 years ago | (#27949601)

I've got boxes running it all over my office. OK, it's Java 1.4, so you have to use Retroweaver if you want generics, but it is pretty zippy. Also, Sun has embedded JVMs for PowerPC and ARM.

Re:Hilarious Overkill (1, Interesting)

Anonymous Coward | more than 5 years ago | (#27949693)

Did you miss the part where some ARM processors can execute Java byte code natively?

Re:Hilarious Overkill (1)

rackserverdeals (1503561) | more than 5 years ago | (#27949761)

Except on everything not x86. The speed you currently see on desktop/server Java is only accomplished by very good just-in-time compilers, which are tweaked for x86. So everything else runs Java like crap.

Have you not heard of Jazelle [arm.com]? It supports just-in-time and ahead-of-time compilers for ARM and is mentioned in the Wikipedia link you gave.

It provides direct execution of Java bytecode and was announced in 2000 [eetimes.com].

Re:Hilarious Overkill (0)

Anonymous Coward | more than 5 years ago | (#27959951)

Except on everything not x86.

I've heard rumors that SPARC has pretty decent Java JIT compilers too. /facepalm

Re:Hilarious Overkill (0)

Anonymous Coward | more than 5 years ago | (#27963169)

-1, Ignorant.

Seriously, wtf?

Re:Hilarious Overkill (1, Informative)

mangu (126918) | more than 5 years ago | (#27949197)

Java is now one of the fastest languages around.

Is your application CPU-limited? If so, is it *the* fastest language? Those are the questions one should be asking when picking a programming language.

If your application is limited by the CPU, only the fastest language, C, will do for some routines. You may even consider using assembly or machine-optimized code such as Atlas [sourceforge.net]

If your application isn't limited by the CPU, then development speed is more important than execution speed. A rule of thumb I use is how big the development team is. If there are just a few people, or if the developers work more or less independently of each other, I'd recommend Python.

Java development, in my experience, is more laborious than Python or Ruby. Unless you have big teams of developers who must work closely together, I wouldn't recommend Java for anything.

That has nothing to do with how efficient compilers have become in recent years.

Re:Hilarious Overkill (4, Insightful)

mcvos (645701) | more than 5 years ago | (#27949461)

Is your application CPU-limited?

No, it's developer-limited. For most applications, development time is a bigger issue than execution speed. Only for very heavily used low-level routines (OS stuff, graphics libraries, VMs, etc) is it really worthwhile spending extra effort on extreme optimisation.

If so, is it *the* fastest language?

I don't have any recent benchmarks, but I remember that back in the days of the Java 5 JVM, Java was about 10% slower than equivalent C++, which is pretty good. But since then, JVMs have gotten quite a bit faster. It would surprise me if Java was not on at least equal terms with C++ now, although highly optimised low-level C is still going to be faster. But that's also extremely tedious to code.

Those are the questions one should be asking when picking a programming language.

No, the main question you need to ask when picking a language is whether your code is going to be maintainable, and how expensive you can afford your maintenance to be. That's still the main timesink in development.

If your application is limited by the CPU, only the fastest language, C, will do for some routines. You may even consider using assembly or machine-optimized code such as Atlas [sourceforge.net]

You accidentally hit the nail right on the head there: C is not necessarily the fastest language, highly optimised custom assembly is. And any language is only as fast as it can be if the programmer knows what he's doing. Some languages do more than others to make optimal code easy to write.

Java development, in my experience, is more laborious than Python or Ruby. Unless you have big teams of developers who must work closely together, I wouldn't recommend Java for anything.

Oh, I agree, Java stopped being an easy development language quite some time ago, and moved to the side of the fast execution languages. This is also why I switched from Java to Ruby. However, I just might switch to Scala because recent JVMs are so incredibly cool. The power of Java these days is more in the awesomeness of the JVM than in the language itself.

Even so, there is an enormous amount of support for Java. It is by far the biggest language for enterprisey server stuff. I think there are as many web frameworks for Java as there are for all other programming languages put together. This is one of the big strengths of Java, but at the same time, this architectural overload is also one of the major hurdles for starting in Java.

However, my point was that Java is pretty fast, which it is. If speed is an issue, Java can be an excellent choice (unlike Ruby, for example). If speed is the only thing that matters, then highly optimised C or assembly is really the only option.

Re:Hilarious Overkill (1)

rackserverdeals (1503561) | more than 5 years ago | (#27950751)

The "problem", as you pointed out, is that Java has a lot of options while Ruby has Rails and Python has Django and that's about it. You could build a similar tool set with Java/stack with Java to increase productivity. There are Java MVC frameworks, CRUD application generators, persistence strategies, code generators, and so on. A good Java IDE [netbeans.org] helps peice them together.

It could/should be better, though; with the right set of tools/frameworks/libraries you can be very productive in Java.

Re:Hilarious Overkill (0)

Anonymous Coward | more than 5 years ago | (#27954561)

Use Scala: you get the development time of Ruby/Python (probably better in some cases), it runs on the JVM, and it can use Java libraries. Best of both worlds.

Re:Hilarious Overkill (1)

ultranova (717540) | more than 5 years ago | (#27954593)

If your application is limited by the CPU, only the fastest language, C, will do for some routines. You may even consider using assembly or machine-optimized code such as Atlas

If your application is limited by the CPU, you'll need to parallelize it to take advantage of multicore CPUs. C is probably the worst possible choice there, with the possible exception of assembly.

Furthermore, while low-level languages allow for all kinds of tricks in the hands of a master, the chances are that you are close to an average coder, which means that you won't likely see much if any speed advantage. Assembly also has the problem of not being readily portable.

If there are just a few people, or if the developers work more or less independently of each other, I'd recommend Python.

Python's main problem, as I see it, is that objects have a type that's defined by the methods they support, yet the language pretends that they are typeless. Consequently there is no type checking at program launch, yet you can still get a type mismatch error at runtime. It's bloody annoying.

Re:Hilarious Overkill (1)

MikeFM (12491) | more than 5 years ago | (#27951737)

After the half hour it takes to load. ;)

Re:Hilarious Overkill (1)

chthonicdaemon (670385) | more than 5 years ago | (#27954661)

Languages aren't fast -- compilers are efficient. How "fast" a language is perceived to be is purely a function of whether an efficient compiler exists for the particular platform.

Re:Hilarious Overkill (1)

HAWAT.THUFIR (928140) | more than 5 years ago | (#27981025)

Languages aren't fast -- compilers are efficient. How "fast" a language is perceived to be is purely a function of whether an efficient compiler exists for the particular platform.

Plus, you can always compile Java to native code.

Re:Hilarious Overkill (2, Insightful)

WankersRevenge (452399) | more than 5 years ago | (#27949449)

I didn't realize it was Slashdot's five-minute hate already. Jeez, it's early. Anyone have any stray pictures of Gosling that I can yell at? Being somewhat serious, why does the Slashdot groupthink give C# a free pass whereas Java gets all the hate? I haven't looked, but I assumed both are similar in performance. Is it because one of them is integrated right into GNOME? Or is it because Java is the most popular language for enterprise development?

Re:Hilarious Overkill (0, Redundant)

fractoid (1076465) | more than 5 years ago | (#27949675)

Maybe because MS hasn't changed the .NET API every 5 minutes with stupid trivial changes, then deprecated the old versions? Java had a horrible case of too many cooks.

Re:Hilarious Overkill (0, Flamebait)

rackserverdeals (1503561) | more than 5 years ago | (#27950355)

Java had a horrible case of too many cooks.

Yeah, screw all this collaboration crap. Let's get back to a single-vendor proprietary solution. Someone please inform SourceForge that we will no longer be needing their services.

Re:Hilarious Overkill (1)

fractoid (1076465) | more than 5 years ago | (#27950387)

I dunno about you but I prefer my soup to be a "proprietary solution" made by a "single vendor" and advertised as, say, "pumpkin soup". I would really hate to try and eat soup made by a group of volunteers, some of whom think they're making pumpkin and leek soup, but most of whom are busy arguing over whether they're making greek salad, hot-and-sour soup, or nachos.

Re:Hilarious Overkill (1)

jsight (8987) | more than 5 years ago | (#27950537)

Maybe because MS hasn't changed the .NET API every 5 minutes with stupid trivial changes, then deprecated the old versions? Java had a horrible case of too many cooks.

Deprecated APIs have not been removed, and haven't changed nearly that often. So what are you really complaining about?

Re:Hilarious Overkill (1)

WankersRevenge (452399) | more than 5 years ago | (#27950579)

If anything, MS keeps things around far too long. Ars wrote up an excellent article on it. Here's a snip:

For example, there's a function called OpenFile. OpenFile was a Win16 function. It opens files, obviously enough. In Win32 it was deprecated--kept in, to allow 16-bit apps to be ported to Win32 more easily, but deprecated all the same. In Win32 it has always been deprecated. The documentation for OpenFile says, "Note: Only use this function with 16-bit versions of Windows. For newer applications, use the CreateFile function." But in spite of that, Win64 still has OpenFile. No one should be using it, but it's still there.

from: http://arstechnica.com/apple/news/2008/05/microsoft-learn-from-apple-II.ars/3 [arstechnica.com]

The good thing about Java is that when the API changes you still have the option of running older JREs. I'm not sure if that's the case with the .NET Framework.

Re:Hilarious Overkill (1)

obijuanvaldez (924118) | more than 5 years ago | (#27951641)

The good thing about Java is that when the API changes you still have the option of running older JREs. I'm not sure if that's the case with the .NET Framework.

The .NET Framework is similar to the JRE in this regard. You can target a build to use whichever version of the .NET Runtime you choose. That is, if your application targets the .NET Framework 2.0, it will work wherever the .NET 2.0 Runtime is installed, regardless of any changes made in the .NET Framework 3.5.

As far as the underlying Windows API changes, a motivation for the .NET Framework is to abstract development away from the OS API. It's similar to the approach used with Java, although primarily implemented for abstracting from the various versions of Windows rather than any OS.

Re:Hilarious Overkill (1)

Mr2001 (90979) | more than 5 years ago | (#27960217)

Being somewhat serious, why does the Slashdot groupthink give C# a free pass whereas Java gets all the hate? I haven't looked, but I assumed both are similar in performance.

.NET (including C#) keeps more metadata around for the JIT to use, which could theoretically result in improved performance.

One example is generics: if you use ArrayList<int> in Java, you end up with a bunch of boxing/unboxing overhead at runtime, because the compiler erases the type and turns it into a list of objects. List<int> in .NET remains a list of primitives, so there's no boxing.

On the other hand, I suspect more effort has gone into optimizing JVMs than optimizing the CLR, so I'm not sure whether the extra metadata actually translates into better performance.
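The erasure side of that is easy to demonstrate from plain Java (the boxing the parent describes happens on every add of a primitive value):

    import java.util.ArrayList;
    import java.util.List;

    public class ErasureSketch {
        public static void main(String[] args) {
            // Erasure: both parameterizations share one runtime class.
            System.out.println(new ArrayList<Integer>().getClass()
                    == new ArrayList<String>().getClass()); // prints true

            // Autoboxing: add(5) stores a heap-allocated Integer
            // (Integer.valueOf under the covers), not a bare int.
            List<Integer> list = new ArrayList<>();
            list.add(5);
            int n = list.get(0); // unboxed again on the way out
        }
    }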

Re:Hilarious Overkill (1)

HAWAT.THUFIR (928140) | more than 5 years ago | (#27980925)

One example is generics: if you use ArrayList<int> in Java, you end up with a bunch of boxing/unboxing overhead at runtime, because the compiler erases the type and turns it into a list of objects. List<int> in .NET remains a list of primitives, so there's no boxing.

While C# is better in some regards, I don't think your description of Java is quite correct. AFAIK the boxing/unboxing occur at compile (to bytecode) time and the JVM sees *exactly* the same code as if you'd done List int -- that is, generics is strictly syntactic sugar in Java, which can only be detected at compile time. Absolutely, it's time for Java to die and rise from the ashes as Java2, or just die completely and let JRuby/Scala/whatever take over. Sure, C# is a tad better in some regards, but what's the point of a VM which isn't cross-platform?

Re:Hilarious Overkill (1)

Mr2001 (90979) | more than 5 years ago | (#27982449)

AFAIK the boxing/unboxing occur at compile (to bytecode) time and the JVM sees *exactly* the same code as if you'd done List int -- that is, generics is strictly syntactic sugar in Java, which can only be detected at compile time.

I can't tell if you're disagreeing: that sounds just like what I said.

The Java compiler turns all ArrayList<T> types into the same ArrayList type. The "T" part is erased, and the compiler inserts casts or boxing/unboxing operations to convert between "T" and "object", effectively changing a statement like "list.add(5)" into "list.add(new Integer(5))".

The boxing/unboxing opcodes are inserted at compile time, but those opcodes cause boxing/unboxing operations to take place at runtime. That means more runtime overhead. Instead of just copying an int into the array, now you're allocating a new Integer on the heap and copying that reference into the array.

Sure, C# is a tad better in some regards, but what's the point of a VM which isn't cross-platform?

Not sure what you mean here. That VM is an ECMA standard, and there are multiple implementations of it on various platforms, most notably Mono.

Re:Hilarious Overkill (1)

kaffiene (38781) | more than 5 years ago | (#27959523)

Moron.

I have a console written in Java. It opens in a fraction of a second.

Re:Hilarious Overkill (1)

adamkennedy (121032) | more than 5 years ago | (#27948433)

Indeed.

I mean, I built an identical thing to what they've built (image recognition engine for fixed icons) once using the Perl regular expression engine [cpan.org] (mostly just to prove it could be done). It was pretty awesome.

But I have no illusions that it is the sort of thing that I should be promoting on Slashdot. ...oh wait...

Re:Hilarious Overkill (1, Informative)

Anonymous Coward | more than 5 years ago | (#27948821)

That's a completely different approach. A neural network can detect similar images - the Perl solution can detect only binary-equivalent images. They have chosen this solution as a proof of concept, not for speed.

Re:Hilarious Overkill (1)

adamkennedy (121032) | more than 5 years ago | (#27961523)

Not necessarily, there are ways to implement the equivalent of alpha-channel support in the regex.

I agree, though, that you can't do a percentage match like the neural net does.

Re:Hilarious Overkill (2, Insightful)

cerberusss (660701) | more than 5 years ago | (#27948565)

So they designed and wrote a neural network for the sole purpose of identifying a limited set of icons? Seriously?

They used a library. And it's just an algorithm.

Re:Hilarious Overkill (2, Interesting)

grimJester (890090) | more than 5 years ago | (#27948947)

My reaction as well. My first thought was that the neural network was used to learn/predict user actions based on what's on the screen and give sound cues tied to the predicted action. For example, when some critter spawns in front of you, the player usually casts Fireball -> the network learns that and shouts "Fireball!" every time a monster spawns.

Neuroph looks pretty cool (3, Informative)

physburn (1095481) | more than 5 years ago | (#27948255)

As a programmer, Neuroph looks like a pretty good implementation of neural networks: it handles most of the network types and has a built-in graphical view of the output. Looks nice, but like as not that won't help, because you can rarely debug a neural network by looking at it.

I've used neural networks and genetic programming a few times at work. It's completely different from normal programming. Instead of understanding a problem completely and writing a structured solution to the task, you get a network and train it until its output matches what you think the output should be; no programming involved.
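In Neuroph, that "train until the output matches" loop is mostly configuration. A sketch with the stock XOR example (class names per Neuroph 2.x; the error target and iteration cap are arbitrary):

    import org.neuroph.core.data.DataSet;
    import org.neuroph.core.data.DataSetRow;
    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.nnet.learning.BackPropagation;

    public class XorTrainingSketch {
        public static void main(String[] args) {
            DataSet xor = new DataSet(2, 1);
            xor.addRow(new DataSetRow(new double[]{0, 0}, new double[]{0}));
            xor.addRow(new DataSetRow(new double[]{0, 1}, new double[]{1}));
            xor.addRow(new DataSetRow(new double[]{1, 0}, new double[]{1}));
            xor.addRow(new DataSetRow(new double[]{1, 1}, new double[]{0}));

            MultiLayerPerceptron net = new MultiLayerPerceptron(2, 3, 1);
            BackPropagation rule = new BackPropagation();
            rule.setMaxError(0.01);       // "matches what I think it should be"
            rule.setMaxIterations(10000); // ... or give up eventually
            net.setLearningRule(rule);
            net.learn(xor); // iterates until the error target or the cap
        }
    }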

Basshunter (2, Funny)

Tokerat (150341) | more than 5 years ago | (#27948283)

I just got that obnoxious song out of my head, and now they want to put the whole game in there? No thanks!

Re:Basshunter (2, Funny)

jonaskoelker (922170) | more than 5 years ago | (#27949243)

Next, they'll let players play the game through an IRC bot named Anna...

sounds cool (0, Offtopic)

droidsURlooking4 (1543007) | more than 5 years ago | (#27948371)

but it ain't no iTunes, yeah?!

Neural Network (0)

Anonymous Coward | more than 5 years ago | (#27949311)

I much prefer the idea of using the game to classify images: http://www.gwap.com/gwap/gamesPreview/espgame/

No Turing machine is capable of true AI. Neural networks can help create programs with a new paradigm that is potentially a lot quicker than conventional methods. At the end of the day, all these Boolean machines only have the ability to follow rules...

If you are reading this then this warning is for you. Every word you read of this useless fine print is another second of your life. Don't you have other things to do? Is your life so empty that you honestly can't think of a better way to spend these moments? Or are you so impressed with authority that you give respect and credence to all who claim it? Do you read everything you're supposed to read? Do you think everything you're supposed to think? Buy what you're told you should want? Get out of your apartment. Meet a member of the opposite sex. Stop the excessive shopping and masturbation. Quit your job. Start a fight. Prove you're alive. If you don't claim your humanity you will become a statistic. You have been warned.......Tyler.

SecuROM has... (2, Funny)

zepo1a (958353) | more than 5 years ago | (#27949471)

SecuROM has detected JAVA on your computer! Please uninstall JAVA before launching the game.

BTDT (1)

Tablizer (95088) | more than 5 years ago | (#27951109)

I've seen this on Gilligan's Island already, where the professor hooked electrodes up to the Beatles wannabes while they played.

What is the game it monitors? (1)

Peganthyrus (713645) | more than 5 years ago | (#27954063)

What is DotA?

No, seriously; the article is of no use, and the link in the article to what appears to be the homepage of this DotA game that this code hacks on top of is also completely useless in telling me what kind of game this actually is. Everything involved is a cryptic thicket of terms for a highly involved game I have never heard of.

Re:What is the game it monitors? (0)

Anonymous Coward | more than 5 years ago | (#27955257)

DotA (Defense of the Ancients) originates from a Warcraft 3 multiplayer map that became, and still is, very popular; it's been going for close to a decade now, maybe (though battleships maps are betta!). You can check Wikipedia for DotA info; I know I've seen it in there.

Re:What is the game it monitors? (1)

Peganthyrus (713645) | more than 5 years ago | (#27956589)

Thanks! I've never played any incarnation of Warcraft, so almost nothing about this article made a damn bit of sense to me.

Re:What is the game it monitors? (1)

Barny (103770) | more than 5 years ago | (#27976989)

Had the same problem; absolutely nowhere does it actually state the full name of the game. I saw a few links to WoW stuff on there and figured it was some add-in for the latest expansion...
