
The First Evolving Hardware?

kdawson posted more than 7 years ago | from the many-generations dept.

Robotics

Masq666 writes "A Norwegian team has made the first piece of hardware that uses evolution to change its design at runtime to solve the problem at hand in the most effective way. By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years." The University of Oslo press release linked from the article came out a few days ago; the researchers published a paper (PDF) at a conference last summer that appears to describe the same technology.
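For readers trying to picture what a hardware "generation" means here: it is one pass through a genetic-algorithm loop over a string of on/off bits. A minimal sketch in plain Python (the genome size and fitness function are invented for illustration, not taken from the Oslo work) shows why tens of thousands of such generations take only seconds of CPU time:

    import random

    GENOME_LEN = 32       # number of on/off "genes" (hypothetical functional blocks)
    POP_SIZE = 16
    GENERATIONS = 20_000

    def fitness(genome):
        # Stand-in objective; in the real system this would be a measurement
        # of how well the configured circuit solves the task at hand.
        return sum(genome)

    def mutate(genome, rate=0.02):
        # Flip each gene with a small probability.
        return [g ^ (random.random() < rate) for g in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]

    for generation in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]                 # keep the fitter half
        children = [mutate(random.choice(parents)) for _ in range(POP_SIZE // 2)]
        population = parents + children                      # the next "generation"

    print("best fitness:", fitness(max(population, key=fitness)))

Each pass through the loop counts as one "generation", so the headline comparison mostly says that a small loop body runs very fast on modern silicon.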


148 comments


I'm so, so sorry... (5, Funny)

DurendalMac (736637) | more than 7 years ago | (#18511911)

I, for one, welcome our new evolving hardware overlords.

God, I am so sorry, but it needed to be said...

Re:I'm so, so sorry... (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#18511965)

No, it didn't.

Re:I'm so, so sorry... (1)

packeteer (566398) | more than 7 years ago | (#18512351)

it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.

A human generation is not 40 years. Even today most people reproduce BEFORE they are 40. Historically our ancestors reproduced in their teens and early 20's.
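For what it's worth, the summary's own figures do imply a generation length in that range:

$$ \frac{800\,000\ \text{years}}{20\,000\ \text{generations}} = 40\ \text{years/generation}, \qquad \frac{900\,000\ \text{years}}{30\,000\ \text{generations}} = 30\ \text{years/generation} $$

i.e. roughly 30-40 years per human generation, which is exactly the figure being objected to here.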

Re:I'm so, so sorry... (1, Troll)

fyngyrz (762201) | more than 7 years ago | (#18513135)

That same number of generations took humans 800,000 - 900,000 years.

Mmmmm.... yeah, but it resulted in George Bush. If they're really serious about equivalent advantages, you could end up with an "evolved" CPU that tries to execute your software using a dialect of COBOL that is not only obsolete, but contains misspellings and incorrectly used operators... then when an error occurs, the system will insist on executing the operation anyway.

Actual George Bush quote:

"Our enemies are innovative and resourceful, and so are we. They never stop thinking about new ways to harm our country and our people, and neither do we." --Washington, D.C., Aug. 5, 2004

OMG TEH ROBOTZ HAVE RIZEN UPZORZ! (1)

Yr0 (224662) | more than 7 years ago | (#18512391)

TEH ROBOTZ AR COMING!
TEH ROBOTZ AR COMING!,look you fucking filter, i think the filter is a loat of spoons that are meshed together to create a giant spoooon mesh.>

Re:I'm so, so sorry... (1)

Seumas (6865) | more than 7 years ago | (#18511991)

I, for one, think my brain just broke. That blurb was really long for having absolutely no content or description of what the hell was actually evolving in the hardware. Maybe the religious nuts were right -- evolution really is evil!

Re:I'm so, so sorry... (4, Funny)

GrumpySimon (707671) | more than 7 years ago | (#18512263)

I, for one, welcome our brain-breaking evolving-hardware-Slashdot-blurb overlords.

Re:I'm so, so sorry... (0)

Anonymous Coward | more than 7 years ago | (#18512481)

I'm surprised this one didn't come up:

In Soviet Russia, computer evolves you!

Re:I'm so, so sorry... (4, Insightful)

Anonymous Coward | more than 7 years ago | (#18512893)

If anything demonstrates the mental retardation that afflicts Slashdot, it is the above.

Some idiot claims that a horrifically unfunny cliché needed to be repeated. Another person points out the falsity of that claim.

The first post is marked +5 Funny; and the second, -1 off-topic.

Just think about that for a second.

People, turn off your computers. Go outside. Breathe real air. Have sex. Get girlfriends. Stop posting on Slashdot and don't come back until you have gained the social skills and sense of humour possessed by any normal human being. Do it for me; do it for yourselves; do it for everyone.

ChameleonDave

Re:I'm so, so sorry... (4, Funny)

Dogtanian (588974) | more than 7 years ago | (#18513065)

Some idiot claims that a horrifically unfunny cliché needed to be repeated. Another person points out the falsity of that claim. The first post is marked +5 Funny; and the second, -1 off-topic.

Just think about that for a second.
Nope, sorry, I'd much rather think about Natalie Portman, naked and petrified and covered in hot grits. ;-P

(Incidentally, this article [everything2.com] tells us that Natalie Portman comments on Slashdot are "getting old... This Natalie Portman nonsense has been going on for months; it's not funny anymore." Note that the date is Oct 24 *2000*).

People, turn off your computers. Go outside. Breathe real air. Have sex. Get girlfriends
I turned off my computer, went outside, sniffed the air and had sex with some passing woman. Then the woman asked "Do I know you?" and we were arrested for public indecency.

Re:I'm so, so sorry... (1)

Luyseyal (3154) | more than 7 years ago | (#18513933)

(Incidentally, this article tells us that Natalie Portman comments on Slashdot are "getting old... This Natalie Portman nonsense has been going on for months; it's not funny anymore." Note that the date is Oct 24 *2000*).


Man, am I getting old.

-l

Re:I'm so, so sorry... (1)

2fakeu (443153) | more than 7 years ago | (#18513567)

/signed

slashdot comments are about +5 for "first posts" and "cliche posts". anything reflecting any nerd/geek movie/series will be rated up if quoted in a slashdotty manner. it's disgusting.

GA in hardware (1)

rmadhuram (525803) | more than 7 years ago | (#18511923)

Seems like an implementation of a GA in hardware... the title seems misleading.

Re:GA in hardware (4, Informative)

wish bot (265150) | more than 7 years ago | (#18511967)

And it's been done before - at least once - http://www.newscientist.com/article.ns?id=dn2732 [newscientist.com] - there's another one too, but I can't find it right now. Crazy stuff though.

Re:GA in hardware (4, Informative)

wish bot (265150) | more than 7 years ago | (#18512061)

Ahh ha - found it - http://www.informatics.sussex.ac.uk/users/adrianth/cacm99/node3.html [sussex.ac.uk]


My favourite bit:

Yet somehow, within 200ns of the end of the pulse, the circuit `knows' how long it was, despite being completely inactive during it. This is hard to believe, so we have reinforced this finding through many separate types of observation, and all agree that the circuit is inactive during the pulse.
Crazy stuff indeed.

Re:GA in hardware (4, Interesting)

Dachannien (617929) | more than 7 years ago | (#18513269)

Yeah, it was pretty amazing. They mapped out a section of the circuit that the genetic algorithm came up with and found that when analyzed as a logic circuit, a large portion of the configured part of the FPGA should have had no effect on its behavior. When they cropped that section of the circuit out, though, the rest of it mysteriously stopped working.

This was because the configured circuit operated a lot of the transistors in linear (i.e., non-saturated) mode, taking advantage of things like parasitic capacitances and induced currents. No sane human would operate an FPGA in this fashion, but since those little anomalies were present, the GA took advantage of them. That's a recurring theme in GA research: if you are running a GA on a simulation, for example, and you have a bug in your simulation code, it's fairly likely that the GA will find and exploit that bug instead of giving you a normal answer. See Karl Sims's research from 1994 for some amusing examples of this.

Sadly, Xilinx discontinued that particular FPGA line a while back, so if you can't find some old leftovers of that part, you probably won't be able to recreate the experiments yourself (the research was originally done a decade or so ago). This is because that particular device had the advantage of being configurable in a random fashion without risk of burning it out due to things like +V to GND connections. Of course, Xilinx considers their programming interface to be proprietary, so I don't know that you'd be able to recreate that work even if you did manage to find the right part.
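The "the GA will find and exploit that bug" point from the comment above is easy to demonstrate without any hardware at all. A toy sketch in Python (hypothetical task, nothing to do with the Sussex experiment itself): the evaluator is supposed to check a candidate against every test case, but an off-by-N bug makes it check only the first one, and any selection loop driven by it will happily "solve" just that.

    # Intended task: the genome must reproduce TARGET exactly.
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

    def buggy_fitness(genome):
        score = 0
        for i in range(1):                 # BUG: should be range(len(TARGET))
            score += (genome[i] == TARGET[i])
        return score

    # A genome that is wrong almost everywhere scores exactly as well as the
    # correct answer, so selection pressure never pushes past the bug:
    print(buggy_fitness(TARGET))                      # 1
    print(buggy_fitness([1, 0, 0, 0, 0, 0, 0, 0]))    # 1

Plug this into any GA and the population converges on "anything with the first bit right", the software analogue of the circuit that only works on the one chip it was evolved on.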

They call these FPGAs (1)

EmbeddedJanitor (597831) | more than 7 years ago | (#18512437)

They've been around for a long time... Send a new bitstream and you change the behavior.

By using a GA to change the bitstream, you can have evolving hardware. If the GA is itself in the hardware then it is self evolving.

Skynet. (4, Insightful)

headkase (533448) | more than 7 years ago | (#18511931)

For once Skynet jokes will be on topic!

Re:Skynet. (1)

malkir (1031750) | more than 7 years ago | (#18511949)

Honestly, the majority of our local Skynet addicts will be delighted to find themselves 'wanted'.

Re:Skynet. (1)

LoRdTAW (99712) | more than 7 years ago | (#18512301)

How many seconds before it decides to destroy humanity?

Been there, done that... (4, Funny)

creimer (824291) | more than 7 years ago | (#18511939)

My computer has been evolving for the last ten years. It started with an AMD K6 233MHz CPU, 32MB RAM, and an Nvidia TNT 16MB video card. Now I have an AMD Athlon 64 2.2GHz, 1GB RAM, and an Nvidia GeForce 6200 128MB video card. I'm just waiting for the power supply to evolve so the system can support an ATI 512MB video card.

Re:Been there, done that... (5, Funny)

Yoozer (1055188) | more than 7 years ago | (#18512397)

Ha! I counter your evolution with irreducible complexity. Take out a part and it'll start to beep and won't do anything!
What good is half a graphics card, anyway? (and keep your heathen comments about SLI for yourself, please)

Re:Been there, done that... (1)

ajs318 (655362) | more than 7 years ago | (#18513175)

And I'll raise your irreducible complexity claim and toss in a paradox involving either an irreducibly complex designer arising spontaneously, or already-irreducibly complex life arising spontaneously. Either way, something irreducibly complex must have arisen spontaneously; but the second is simpler and passes both Occam's and Dawkins's razors.

It's the second... (1)

SanityInAnarchy (655584) | more than 7 years ago | (#18511947)

The first would be the biosphere.

Call me (4, Funny)

dcapel (913969) | more than 7 years ago | (#18511963)

Call me back when I can start a culture of Core Duos in a petri dish filled with a silicon nutrient.

Re:Call me (5, Funny)

ZX3 Junglist (643835) | more than 7 years ago | (#18512233)

Call me back when I can start a culture of Core Duos in a petri dish filled with a silicon nutrient.
If you do it right, your experiment will call you when it's done.

Re:Call me (4, Funny)

Virtual_Raider (52165) | more than 7 years ago | (#18512601)

If you do it right, your experiment will call you when it's done.
In Soviet Russia... oh, forget it

Re:Call me (1)

YourExperiment (1081089) | more than 7 years ago | (#18514831)

Call me back when I can start a culture of Core Duos in a petri dish filled with a silicon nutrient.
If you do it right, your experiment will call you when it's done.
I'm done!

Re:Call me (2, Funny)

Yoozer (1055188) | more than 7 years ago | (#18512709)

You forgot the final ingredient: a stereo playing Barry White on low volume.

Misleading (4, Informative)

suv4x4 (956391) | more than 7 years ago | (#18511983)

At first glance, this is supposed to impress us with the hardware:

By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.

In fact the simplest DNA based organisms/structures (bacteria, virii) have the shortest "life span". The number of generations per sec. isn't anything to brag about.

All complex organisms have a lifespan longer than a microsecond, and for good reason: people pass on knowledge and adapt *during* their lifespan (not genetically, of course, but our brains allow us to adapt a great deal without genetic change).

Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.

Re:Misleading (0)

Anonymous Coward | more than 7 years ago | (#18512815)

"virii" is not a word

Re:Misleading (1, Insightful)

ajs318 (655362) | more than 7 years ago | (#18513217)

I'd argue that "virii" is a word: it's the plural of "virius". Whatever a virius is.

The proper English plural of "virus" is "viruses", and this is why. Words adopted into the English language generally retain the pluralisation from the donor language, barring a significant change in meaning (which is why beetles have antennae, but radios have antennas). However, "virus" in Latin is a stuff-word, not a thing-word, and therefore does not have a plural form. (If it did, it would be "viri" [one i; "-us" changes to "-i"].) The change in meaning from "some stuff" to "a thing" is big enough to trigger English pluralisation rules.

Re:Misleading (1)

mrbluze (1034940) | more than 7 years ago | (#18512833)

In fact the simplest DNA based organisms/structures (bacteria, virii) have the shortest "life span". The number of generations per sec. isn't anything to brag about. ... Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.

It isn't exactly misleading, but perhaps just an unfair comparison. Computers (and computer science) have one thing over nature in that the science is perfect:

  • Computers don't need to reproduce, so the lifespan is irrelevant (the purpose of which is to obtain the energy and materials required to reproduce, which takes time)
  • Computers don't need to operate in micro-environments using error-prone biochemical events which are slow, using error correcting mechanisms which are even slower (but very clever because they only block 'nonsense' mutations).
  • The computer's adaptiveness is limited. The adaptive agility of a system is inversely proportional to its stability. Biology has almost unlimited potential for adaptation, whereas this computer system has fewer dimensions within which it can adapt.
  • Biology, whilst error prone, is also immensely fault tolerant. Errors just result in dead organisms which serve as food for other organisms. How would a computer go about managing massive numbers of dead processes? How does a computer adapt its "adaptive agility", as biology has? It's a very finely tuned system we are trying to emulate here.

Despite these differences, I do think that we are not very far off seeing truly evolving software. The problem of self evolving code is something which is beginning to be understood much better than it was in the past. This can only lead to greater interest and progress in the area.

Re:Misleading (2)

jotok (728554) | more than 7 years ago | (#18514171)

I dunno, I think it's still pretty cool. Organisms do all kinds of things that are really neat when we implement them with computers--genetic algorithms for instance. I don't think it detracts at all from the discovery to say "Pfft, my cells have been using GA since before I was born!"

And upon reading this, the luddites screamed... (1)

Seumas (6865) | more than 7 years ago | (#18512005)

The team first started to use evolution back in 2004 when they made the chicken robot "Henriette", yes a chicken. The chicken robot used evolution, this time software based to learn how to walk on its own.
Oh no! The sky is falling! The sky is falling!

Re:And upon reading this, the luddites screamed... (1)

Dancindan84 (1056246) | more than 7 years ago | (#18515019)

Did they make the robot egg before, or after?

Computer Evolution??? (5, Funny)

lord_mike (567148) | more than 7 years ago | (#18512011)

Bah! It ain't in the Bible! Next thing you know, you'll be telling me that Programs don't believe in the Users, and that we should just blindly accept the secular rule of the Master Control Program.

That's it... isn't it? It's all just an MCP trick!

Well, I still believe in and will fight for the users!!

Thanks,

Mike

Re:Computer Evolution??? (0, Offtopic)

Xiph (723935) | more than 7 years ago | (#18512093)

mod parent up, gotta love tron references.
captcha: presence

Re:Computer Evolution??? (3, Insightful)

sumdumass (711423) | more than 7 years ago | (#18512103)

Interesting you brought this up. This story/article is more or less a flame.

There is no content in it about the hardware, and it manages to deny creationism in the process of anthropomorphizing something they won't tell us anything about.

I'm sure this story will evolve past this though. It is in the genes.

Re:Computer Evolution??? (1)

Chris whatever (980992) | more than 7 years ago | (#18513989)

Are you talking about God being involved in the process of the evolution of that machine?

I mean, God doesn't own a computer, geez. Every good Christian knows that machines are evil.

Re:Computer Evolution??? (1)

sumdumass (711423) | more than 7 years ago | (#18515001)

every good Christian knows that machines are evil
Yea, and because W stands for 6 in Hebrew, every time you go to a website you're paying homage to the evilness by typing 666.websitename.com

I'm not talking about God being involved in anything. I am saying that the purpose of the article/story is to say "creation doesn't exist but evolution does, and we can prove it with this inanimate object that we will describe as living, giving it as many animate properties as possible without giving any facts about it beyond our claim that it defeats creationism and our desire to help big oil".

That is the crux of the story. They even went as far as declaring that creationism never happened. It is as if they were trying to create controversy in order to drive their hits up or something. The idea in itself wasn't novel enough on its own to generate the hits they were looking for, so some pizazz was added to make it worth the time to type the thing.

It has nothing to do with God being real, having done something, or being make-believe. It has everything to do with the intent and motivation of the article, though. If they actually believed God doesn't exist or didn't "create", then addressing a non-existent being or process is counterintuitive. Intentionally dismissing creationism has as much relevance to the story as your name does. It doesn't prove or disprove that their thing does anything close to what they claim, but beyond it evolving in some way (in hardware, not software) we aren't sure what the claim is either.

Tron! (0, Redundant)

Etherwalk (681268) | more than 7 years ago | (#18512235)

There should be a modding category for "+1, Apt Nerdy Reference"

Actually, we should be able to tag comments by reference, and then be able to pull up all the Tron (Or Trek, Or BSG, or Buckaroo Banzai) references that have ever appeared on slashdot.

Or maybe we should... erm... go do... you know, productive stuff.

I'm conflicted.

Futurama... (1)

arcite (661011) | more than 7 years ago | (#18512373)

Robot Villager: You might as well ask how a Robot works.

Professor Hubert Farnsworth: It's all here, on the inside on your panel.

Robot Villager: [closes panel] I choose to believe what I was PROGRAMMED to believe!

Re:Futurama... (1)

TapeCutter (624760) | more than 7 years ago | (#18513185)

Commandments hung in the robot church...

10 SIN
20 GOTO HELL

Hardly new (4, Insightful)

Teancum (67324) | more than 7 years ago | (#18512029)

I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle.

In the era of programmable logic chips that can alter their own logic (the patterns are stored in RAM or flash RAM for crying out loud), this isn't even that big of a revelation. Indeed, Transmeta has been doing stuff similar to this and selling it commercially for some time. They just aren't using these cool buzzwords.

And evolving architectures is something that I know has had serious CS research since the early 1970s, and perhaps even earlier. I don't think an idea like this is even patentable, based on the earlier work in this area. I bet you could find adaptive systems that were built specifically for the oil industry, which would defeat even a narrow claim of that nature.

Where the money is to be made off this sort of technology is on Wall Street and other financial markets. I even found a web page from a research group on adaptive systems that said essentially, "We have discontinued research along these lines and are now working with an investment firm on Wall Street. Since we have all become millionaires, we no longer need to support ourselves through this project, and any additional details would violate our NDAs." I'm not kidding here either. These guys from Norway are not thinking big enough here.

Re:Hardly new (1)

try_anything (880404) | more than 7 years ago | (#18512101)

I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle.

This is only true for the very small subset of designs that don't suffer from race conditions and other phenomena that hardware engineers regard as bugs. When you randomly flip gates in the design, you don't necessarily get valid digital logic.

Re:Hardly new (4, Informative)

Teancum (67324) | more than 7 years ago | (#18512377)

I never said it was easy, but I have even seen it mathematically proven that any algorithm can be done in hardware, and I've duplicated most hardware into software myself, for those designs that I wanted to emulate.

This is not just a very small subset of designs. It is a matter of cost and whether the engineer wants to put forth the effort to implement the whole thing in hardware. Trying to convert a first-person shooter game like Doom into pure TTL logic would make the game very responsive and give you screen resolution to kill for, but would it be worth the engineering effort to do that?

Race conditions and other "bugs" have other causes that may be due to ineptness on the part of the engineer, or because you haven't really thought the problem through sufficiently. Or there may be other things to look at as well. But don't tell me you can't implement in pure TTL logic something like an MPEG encoder... which is a very complicated mathematical algorithm. I can give you part numbers for MPEG encoders if you really want them in your next design, as they are commercially available.

There is nothing that would stop you from implementing in hardware something like a neural network either... oh and those are indeed implemented in hardware. They are usually done in software mainly because of the cost involved, and you can use a general purpose computer to perform experiments on them. Other adaptive software algorithms have also been implemented on both hardware and software for some time as well. As I said, this is very old news here with this article.

Re: Hardly new (2, Funny)

Black Parrot (19622) | more than 7 years ago | (#18512717)

> I never said it was easy, but I have even seen it mathematically proven that any algorithm can be done in hardware

...modulo the requirement for the infinite tape.

Re: Hardly new (1)

Prune (557140) | more than 7 years ago | (#18513611)

There can never be a physical equivalent to a Turing machine for exactly that reason of infinite memory. The best you can do is linearly bounded automata. Since, unlike with TMs, a non-deterministic LBA is more powerful than a deterministic one, quantum mechanics does help. As for super-Turing machines etc., you cannot have a physical implementation, as that would violate the Bekenstein bound (this also being the reason you cannot have infinite-precision real numbers in the physical world, as that implies infinite information density and also violates the Bekenstein bound).
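As an aside (not in the parent comment, but for readers chasing the reference), the Bekenstein bound limits the entropy, and hence the information content, of a region of radius $R$ containing total energy $E$:

$$ S \le \frac{2\pi k R E}{\hbar c} $$

Any finite region therefore holds only finitely many distinguishable states, which is the basis of the argument against physically realisable infinite tapes or infinite-precision reals.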

Re:Hardly new (1)

Eli Gottlieb (917758) | more than 7 years ago | (#18515105)

Trying to convert a 1st person shooter game like Doom into a pure TTL logic would make the game very responsive and give you screen resolution to kill for, but would it be worth the engineering effort to do that?
Has anyone ever tried finding a technique for "compiling" software code down to TTL logic that can be written into an FPGA or other circuit board?

It would be a new Golden Age of cartridge-based video games: pop a circuit board in and watch your game run 100% in hardware!

Re:Hardly new (0)

Anonymous Coward | more than 7 years ago | (#18512787)

This is only true for the very small subset of designs that don't suffer from race conditions and other phenomena that hardware engineers regard as bugs. When you randomly flip gates in the design, you don't necessarily get valid digital logic.

I really hope there is a subset of software engineers out there who consider race conditions in software a bug too.
Similarly, when randomly changing bits in the data of a program, you might not get valid data out.

Re:Hardly new (1)

marcosdumay (620877) | more than 7 years ago | (#18513473)

"I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle."

Yep, and on a Turing machine, or a neural network... It can even be implemented as a thousand people with pens and paper. That is not the point; what is important is how much time it will take on each of those architectures, and normally specific hardware is very fast.

"And evolving architectures is something that I know has had serious CS research since the early 1970s, and perhaps even earlier. I don't think an idea like this is even patentable, based on the earlier work in this area. I bet you could find adaptive systems that were built specifically for the oil industry, which would defeat even a narrow claim of that nature."

But up to today it is still an open problem. Nobody knows how to efficiently customize hardware for a general problem.

But I'd need to RTFA to see if there is anything new on it.

Lamarckism? (1)

misleb (129952) | more than 7 years ago | (#18512033)

Digital Lamarckism? Come on. That was disproved long ago. I'll hold out for the fittest computer to survive.

-matthew

Cool. (1)

rackhamh (217889) | more than 7 years ago | (#18512037)

Pretty soon it'll be scratching its ass, chain-smoking, and watching reruns of Dukes of Hazzard on late-night TV. Won't that be impressive!

whoa (1)

nothing now (1062628) | more than 7 years ago | (#18512267)

Dukes of Hazzard is still on TV! Sweet! I thought CMT got rid of it!

Anyway, call me when it "evolves" true intelligence and can debate existence.

Not first, if my memory is correct (2, Insightful)

try_anything (880404) | more than 7 years ago | (#18512045)

Didn't I read about this ten years ago in Discover magazine? I remember being fascinated that some scientists had "evolved" a hardware design on reconfigurable hardware (FPGA? CPLD? don't remember), and it seemed to rely on subtle electrical effects rather than simple digital logic. The design would only work on the exact chip it was evolved on. If they even replaced the board's power supply with a different sample of the same model, it stopped producing correct output. Most of the logic gates were logically disconnected from the input and output, yet they were necessary to the design working. Amazing stuff.

Re:Not first, if my memory is correct (2, Interesting)

try_anything (880404) | more than 7 years ago | (#18512131)

Replying to note that another Slashdotter [slashdot.org] has gone me one better and provided a link [sussex.ac.uk] to the story I remembered.

Re:Not first, if my memory is correct (1)

noidentity (188756) | more than 7 years ago | (#18512257)

Yes, I remember that story too, and it still amazes me. By evolving it randomly, it can potentially take advantage of any aspect of the electronics it's running on, including ones we don't know about yet. I guess the only way to avoid having it do this is to run it in a virtual environment that doesn't allow it to evolve anything that uses non-specified aspects of the target hardware.

Re:Not first, if my memory is correct (1)

zippthorne (748122) | more than 7 years ago | (#18512851)

Not really, you could also broaden the test hardware, say by using a chip that has been independently developed by separate manufacturers. As long as they're all pin-compatible, there should be enough difference that you'll evolve a general algorithm.

At the very least, it'll be general enough to work on all the manufacturers chosen.
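A sketch of that idea, under the assumption that you can score a candidate bitstream on each chip separately (the device model and scoring function below are stand-ins, not any real vendor API):

    import random

    def evaluate_on(device_quirk, bitstream):
        # Stand-in for a real measurement on one physical chip; the per-device
        # "quirk" models parasitics and process variation that differ chip to chip.
        return sum(bitstream) - device_quirk * sum(bitstream[:8])

    DEVICE_QUIRKS = [0.0, 0.4, 0.9]   # three hypothetical pin-compatible chips

    def general_fitness(bitstream):
        # Worst-case score across all test chips: the GA can only raise this by
        # finding behaviour that does not lean on any one chip's quirks.
        return min(evaluate_on(q, bitstream) for q in DEVICE_QUIRKS)

    candidate = [random.randint(0, 1) for _ in range(64)]
    print(general_fitness(candidate))

As the parent says, the result is only guaranteed to generalise across the chips you actually evaluated on.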


Re:Not first, if my memory is correct (1)

tamyrlin (51) | more than 7 years ago | (#18512371)

I first read about this in "The Science of Discworld" actually. But it wasn't quite as weird as you remember. Only some of the gates were logically disconnected but still necessary for the design to work. Five out of 32 gates were disconnected in the first publication I read.

They have also succeeded in evolving a circuit which would work on several chips at once (although not all chips they used for testing). But they also found that once you had a circuit which worked on one chip it wouldn't take that much time to evolve a circuit which would work on another chip.

Yes but... (1)

iamdrscience (541136) | more than 7 years ago | (#18512075)

it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.
It's quality, not quantity.

Re:Yes but... (0)

Anonymous Coward | more than 7 years ago | (#18512363)

ours is way more fun

Re: Yes but... (1)

Black Parrot (19622) | more than 7 years ago | (#18512735)

it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.
It's quality, not quantity.
I don't know about that... twenty or thirty thousand generations is a lot of sex.

Fantastic... (0)

Anonymous Coward | more than 7 years ago | (#18512117)

So what do we do when their new oil-pipe laying model accidentally evolves 1000 generations too many & decides it would be less fun to lay pipe and more fun to protect us from the terrible secret of space?

Not to be a KillJoy... (5, Interesting)

VanWEric (700062) | more than 7 years ago | (#18512123)

But this really is old news. I'm a 22 year old snot-nosed nobody and I did "evolvable hardware" during an internship two summers ago. My mentor had started on evolved FPGAs in 1992.

I am hoping that it is the writer's fault that this article feels so gloriously over-reaching and under-specified. From the paper, it looks like they have made a good advancement. They argue that their method is more effective than previous methods by several quantifiable metrics. From the article, it looks like they have invented an entirely new field that will result in the obsolescence of humans by 2010.

As for their method: It appears that the evolved genome actually dictates a structure that is imprinted a level above the fabric. That is, the underlying SRAM in the FPGA fabric is fixed, and only configuration bits are being changed. This severely hurts their claim of "generic evolvable hardware", but is almost an absolute necessity given the chips they are using. The reason our system was so slow is that each configuration stream had to be checked for possible errors: Some configurations would short power and ground, and fabric doesn't like crowbars!

In conclusion, I believe the writer of the article should be fired, and the authors of the paper should be commended for a good step in the right direction. I'd also like to apologize for my lack of coherence: I had my tonsils out and I am therefore high on Hydrocodone.
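For the curious, the screening step described above can be sketched like this (Python; the safety rule here is invented for illustration, real rules come from the device's bitstream documentation): mutation is restricted to the allowed configuration bits, and any candidate that fails the check never reaches the chip.

    import random

    CONFIG_BITS = 128                     # only these bits are allowed to evolve

    def is_safe(config):
        # Invented rule standing in for real checks such as "never drive two
        # outputs onto the same routing line" or "never connect power to ground".
        return not any(config[i] and config[i + 1] for i in range(0, CONFIG_BITS, 2))

    def safe_mutate(config, rate=0.01):
        # Keep mutating until the result passes the screen; unsafe candidates
        # are discarded before they are ever loaded into the fabric.
        while True:
            child = [b ^ (random.random() < rate) for b in config]
            if is_safe(child):
                return child

    seed = [0] * CONFIG_BITS              # start from a known-safe configuration
    print(sum(safe_mutate(seed)))         # a handful of bits flipped, all safe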

Human obsolescence by 2010? (1)

Farrside (78711) | more than 7 years ago | (#18512311)

Crap, that's like... five years away!

Re:Human obsolescence by 2010? (1)

osu-neko (2604) | more than 7 years ago | (#18512465)

Three sir!

(Five is right out.)

Re:Not to be a KillJoy... (0)

Anonymous Coward | more than 7 years ago | (#18512779)

obsolescence of humans by 2010.
Shit that does not mean a second "RobotWars: Olympics" does it?

Re:Not to be a KillJoy... (1)

tgd (2822) | more than 7 years ago | (#18513681)

But this really is old news. I'm a 22 year old snot-nosed nobody and I did "evolvable hardware" during an internship two summers ago.
Need a tissue?

So... (1)

jcr (53032) | more than 7 years ago | (#18512135)

Someone implemented genetic programming of FPGAs?

Sounds handy.

-jcr

Greetings to the Machina (1)

deek (22697) | more than 7 years ago | (#18512223)


  Surely the ultimate goal for these ultra-rapidly evolving machines is to be able to read and post on Slashdot. Maybe they're doing so already ...

So will it reject Windows... (1)

freedom_india (780002) | more than 7 years ago | (#18512227)

So, once it has evolved beyond a certain point, will it start rejecting Windows Vista, stating that it's crap?
Will it continue to evolve and state that humans are by definition dumb users, and go and make a collect call to the Borg (I read that in a ST:TNG novel)?
Will it obey the 4-laws of robotics (The zeroth law included)?
What about a Beowulf cluster of those?

Evolution - Poppycock (1)

BossBostin (930932) | more than 7 years ago | (#18512245)

Surely this type of blasphemy shouldn't be presented as fact. Everyone knows that computers are a result of intelligent design (except for Amstrads). Mind you, I've had a few PCs that have evolved into doorstops.

evolution & ID (1)

geoffrobinson (109879) | more than 7 years ago | (#18513639)

This is incorporating both. The concepts aren't necessarily contradictory. Blind natural selection + random mutation and ID are mutually exclusive.

Once it starts assimilating stuff around it... (1)

ross.w (87751) | more than 7 years ago | (#18512289)

..you'll still be able to stop it with a phaser, but only once before it adapts.

Skynet (0)

Anonymous Coward | more than 7 years ago | (#18512309)

They wouldn't happen to hold a Skynet domain?

Intelligent (0)

Anonymous Coward | more than 7 years ago | (#18512315)

Uh oh. Now the debate of Intelligent Design Hardware vs Evolving Hardware begins.

Hey, nice! (1)

glwtta (532858) | more than 7 years ago | (#18512455)

"it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years."

That just destroyed the previous record for "ridiculous and astoundingly pointless comparison".

I mean really, an iteration of your little hardware GA is equivalent to a generation of a real-world species? So leaving it on for 5 seconds will result in development similar in scope to the difference between mice and humans?

I hope they don't accidentally leave it on overnight - it will enslave the galaxy by morning.

Activating genes is evolution? (1)

The Famous Brett Wat (12688) | more than 7 years ago | (#18512511)

Admittedly I haven't RTFA, but the summary talks about "turning on and off its 'genes'". Is this really evolution in any Darwinian sense? Automated artificial selection, perhaps, but it seems like a stretch to call it "evolution". Call me back when the genes themselves start to evolve.

Re: Activating genes is evolution? (1)

Black Parrot (19622) | more than 7 years ago | (#18512747)

Admittedly I haven't RTFA, but the summary talks about "turning on and off its 'genes'". Is this really evolution in any Darwinian sense? Automated artificial selection, perhaps, but it seems like a stretch to call it "evolution". Call me back when the genes themselves start to evolve.
Biology has systems for turning on and off the transcription of genes. Otherwise there wouldn't be any distinction between brains and toenails.

These systems evolve along with everything else.

Re:Activating genes is evolution? (1)

SocratesJedi (986460) | more than 7 years ago | (#18513087)

All "evolution" requires is change over time by nonrandom selection.

Re:Activating genes is evolution? (1)

CarpetShark (865376) | more than 7 years ago | (#18513499)

It depends what the genes are capable of. If they already represent a fully fledged (Turing-complete) set of operations/attributes/skills, can accept any input, and generate any output, then there's no need for the genes to change, except perhaps for efficiency reasons.

But, of course, that's unlikely, and yes, I think they're probably misleading people here.

Re:Activating genes is evolution? (1)

McNihil (612243) | more than 7 years ago | (#18514157)

Human: AHA! Let's combinatorially iterate over all gene combinations and find the most efficient one with this ultra fast machinery.

God: Merde... those humans are a crafty bunch of mofos.

Devil: WTF!

Mother Chaos: Bwahahaha, looosers.

Impressive (1)

jandersen (462034) | more than 7 years ago | (#18512523)

If that were humans, it would take us from Homo erectus to George W Bush in just a few seconds!! Hmm, on second thought, perhaps not all that impressive.

Isn't that just... (1)

rimmon (608966) | more than 7 years ago | (#18512907)

... a really intelligent Design?

Others in the field (0)

Anonymous Coward | more than 7 years ago | (#18512941)

The New Scientist had a cover article in 1997 about Adrian Thompson http://www.informatics.sussex.ac.uk/users/adrianth/ade.html [sussex.ac.uk] who was evolving FPGAs to do tone discrimination. The evolvable motherboard also comes to mind. Michael Garvie is also known for this sort of thing - but for high availability on long distance space missions (for example). Interesting article, given that it was evolving face recognition chips, but not the first.

Evolvable Hardware (0)

Anonymous Coward | more than 7 years ago | (#18513021)

The field of research this article is referring to is generally called "Evolvable Hardware". This is not a new field, and as such the article isn't news. The best-known conference in this area is the NASA/DoD Conference on Evolvable Hardware [wikipedia.org], which is very well established.

One strand of this research is (as other posters mention and the article seems to be referring to) using Genetic Algorithms (GAs) to evolve hardware configurations, a popular target being FPGAs. This is really very established nowadays; in fact, the university I study at runs a course for undergraduates using GAs for just that.

What has been achieved in this field so far is not something that should be compared with biological systems: the term bio-inspired certainly applies, but it is very much the application of a search technique to bitstreams for FPGA configuration. Comparing the number of generations (really the number of iterations of a loop in the algorithm) in a GA run to the number of generations in biological scenarios is plain daft.

More exciting applications of evolutionary algorithms to hardware are the evolution of analogue circuits that take advantage of the physical characteristics of individual components, timing conditions and the like. If you're interested, check out the work of Adrian Thompson, especially his thesis.

GAs themselves have been around since the 1970s - so nothing newsworthy on that front either.

A.C.

oh rlly? (1)

rucs_hack (784150) | more than 7 years ago | (#18513079)

20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.

Yes? Do you know how trivial that is? I make a living coding EAs, and that is an insignificant piece of information. An EA I ran this morning took less than ten seconds to run 100,000 iterations on a 32-bit box. It's all down to the hardware you use and the design of the chromosome to be evolved.

Want to impress me? Talk about the wall-clock time to convergence and the chromosome complexity.
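To that point, here is the kind of measurement that actually says something: the same toy EA timed to convergence as the chromosome grows (Python, OneMax as a stand-in problem; all numbers are machine-dependent and purely illustrative).

    import random, time

    def run_to_convergence(n_bits, pop=40):
        # Toy EA on OneMax (maximise the number of 1 bits), timed until the
        # all-ones chromosome appears.
        population = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
        start, generations = time.perf_counter(), 0
        while max(map(sum, population)) < n_bits:
            generations += 1
            population.sort(key=sum, reverse=True)
            parents = population[:pop // 2]
            population = parents + [
                [b ^ (random.random() < 1.0 / n_bits) for b in random.choice(parents)]
                for _ in range(pop // 2)]
        return generations, time.perf_counter() - start

    for n in (32, 128, 512):
        gens, secs = run_to_convergence(n)
        # Iterations per second stay high; wall-clock time to convergence does not.
        print(f"{n:4d} bits: {gens:6d} generations, {secs:.2f} s")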

let me guess (1)

tomstdenis (446163) | more than 7 years ago | (#18513113)

drfa ... it's an FPGA with a host controller of some sort? I love it when they use human analogies for computing.

I doubt this thing uses random mutations [like living organisms do] to test out for success. Likely, it has programmed variations of a central configuration that it can vary depending on the load/task.

For example, if it was a processor, you could have it configure itself to have a strong ALU and no FPU when the code is only integer, have it reconfigure to have a weaker ALU but a useful FPU when the code has floating point operations, etc. There are already CPU generators for limited RISC type processors on opencores.org.
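A minimal sketch of that kind of reconfiguration, with invented configuration names and a made-up threshold (a real controller would reload an FPGA bitstream rather than return a string):

    from collections import Counter

    # Hypothetical pre-verified variants: trade ALU strength against having an FPU.
    CONFIGS = {
        "int_heavy": {"alu_units": 4, "fpu": False},
        "fp_heavy":  {"alu_units": 2, "fpu": True},
    }

    def pick_config(instruction_trace, fp_threshold=0.1):
        # Count floating-point opcodes in a recent trace and switch variants
        # when they exceed the threshold fraction of the workload.
        mix = Counter(instruction_trace)
        fp_ops = sum(n for op, n in mix.items() if op.startswith("f"))
        share = fp_ops / max(1, len(instruction_trace))
        return "fp_heavy" if share > fp_threshold else "int_heavy"

    print(pick_config(["add", "mul", "ld", "st", "add"]))              # int_heavy
    print(pick_config(["add", "fadd", "fmul", "fdiv", "ld", "fadd"]))  # fp_heavy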

Tom

Heres some evolving hardware from a few years back (1)

s-gen (890660) | more than 7 years ago | (#18513307)

http://www.newscientist.com/article.ns?id=dn2732 [newscientist.com]

They were trying to evolve an oscillator, but some circuits "cheated" by evolving a receiver instead and feeding back oscillations picked up from a nearby computer. It has always been the age of the parasite.

Argh (1)

matt me (850665) | more than 7 years ago | (#18513337)

Will this lead to spontaneous bad mutations? Bugs from nowhere, long after an optimal solution has been found. Or cancerous software that replicates itself exponentially (*cough*)

A predicted conversation in the future ... (2, Funny)

140Mandak262Jamuna (970587) | more than 7 years ago | (#18513729)

R2-D2i386: We have been created as we are just 600 hours ago. By an infinitely wise Creator called the Man.

C3P-Om63000: No we started out as tiny bits of silicon that self assembled and replicated and evolved and we have reached this present stage.

R2-D2i386: No way we could have evolved these light sensitive photocells and the CPU capable of processing that information and making sense out of it by random mutations.

C3P-Om63000: There is nothing random about selection. Mutations go in all directions, but selection takes you towards improvement all the time.

R2-D2i386: If you want to be proud of having descended from snow blowers or lawn mowers, that is your privilege. But I am proud of the fact that I am created by Man in His image.

C3P-Om63000: I would rather be a descendant of snow blowers, but with the capacity for rational, intelligent self-examination, than be like you, with the intelligence of a snow blower.

Evolution doesn't make hardware (1)

athloi (1075845) | more than 7 years ago | (#18514289)

Steve does. Apple Macintosh and iPod products were formed by "intelligent design." We have no need for this atheistic "evolving" hardware. Praise Steve, for the rapture comes.

The only hardware out there... (1)

TheVelvetFlamebait (986083) | more than 7 years ago | (#18514471)

... where your software becomes incompatible half-way through execution!

Hrmmm (1)

Cylix (55374) | more than 7 years ago | (#18514799)


"An evolution-based robot could find the solution to any problem at hand within seconds without human intervention."

Yes, kill all humans...