Slashdot: News for Nerds


830 comments

First (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33276986)

or second


ahh, the "singularity"... (5, Insightful)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#33276994)

The singularity is to nerds what the rapture is to fundamentalist protestant wackjobs....

Re:ahh, the "singularity"... (5, Insightful)

benjfowler (239527) | more than 3 years ago | (#33277138)

There surely has to be a Rule 34 for pseudoscientific crap.

"If it exists, there is woo of it".

There's physics and quantum woo (Deepak Chopra), food and nutrition woo, health woo, laundry woo, automotive woo, fortune-telling and divination woo, religious woo.... wouldn't stupid and silly ideas like hard AI and the singularity count as "IT woo"?

Re:ahh, the "singularity"... (4, Funny)

dfetter (2035) | more than 3 years ago | (#33277366)

And of course, John Woo. Let's hear it for two-fisting, slow motion and doves!

Re:ahh, the "singularity"... (4, Funny)

mbone (558574) | more than 3 years ago | (#33277144)

Myself, I think that both the singularity and the rapture have already happened. You didn't translate to the other realm. Get over it.

Re:ahh, the "singularity"... (-1, Troll)

frist (1441971) | more than 3 years ago | (#33277202)

The singularity is to nerds what the rapture is to fundamentalist protestant wackjobs....

After one reads an article about the infinite complexity of the human brain, one has to wonder if the fundamentalist protestants (people who believe that we had to have been created due to the immense complexity of even the tiniest cell) are the whackjobs, as opposed to those who are satisfied with the theory that life evolved from inorganic chemical compounds, totally by chance, with a series of infinitely improbable events occurring in the right sequence over and over and over again. Darwin made the same basic mistake as Ray Kurzweil: he made assumptions about how simple things are (e.g. the "simple" cell).

Re:ahh, the "singularity"... (3, Insightful)

Lunix Nutcase (1092239) | more than 3 years ago | (#33277224)

as opposed to those who are satisfied with the theory that life evolved from inorganic chemical compounds, totally by chance, with a series of infinitely improbable events occurring in the right sequence over and over and over again.

What a lovely caricature you've constructed there. Secondly, just like most crappy caricatures of biological evolution, you also seem to conveniently gloss over the major role that natural selection plays, which is not random.

Re:ahh, the "singularity"... (4, Funny)

Ngarrang (1023425) | more than 3 years ago | (#33277306)

as opposed to those who are satisfied with the theory that life evolved from inorganic chemical compounds, totally by chance, with a series of infinitely improbable events occurring in the right sequence over and over and over again.

What a lovely caricature you've constructed there. Secondly, just like most crappy caricatures of biological evolution, you also seem to conveniently gloss over the major role that natural selection plays, which is not random.

Oh, yeah...explain the platypus, then.

Re:ahh, the "singularity"... (0)

Anonymous Coward | more than 3 years ago | (#33277394)

Oh, yeah...explain the platypus, then.

Already explained, it was Q

Re:ahh, the "singularity"... (3, Insightful)

Lunix Nutcase (1092239) | more than 3 years ago | (#33277360)

Oh and as a side note, the current state of the field in biological evolution has long since moved past the works of Darwin. Your remark is about as disingenuous as trying to use the failings of Newton's classical mechanics to make criticisms of the current state of quantum mechanics.

Infinite complexity? (2, Interesting)

mangu (126918) | more than 3 years ago | (#33277374)

After one reads an article about the infinite complexity of the human brain, one has to wonder if the fundamentalist protestants are the whackjobs

What do you mean "infinite"? The human brain is composed of one hundred billion or so neurons. Looks like it's pretty much finite to me. I have ten times as many bytes of information in my hard disk.

Re:ahh, the "singularity"... (3, Insightful)

MrHanky (141717) | more than 3 years ago | (#33277424)

After one reads a comment about the supposed wackiness of biological evolution, one has to wonder whether people are taught biology in school these days at all.

No shit (0)

Anonymous Coward | more than 3 years ago | (#33277028)

More time spent on talking bullshit, telling you what you want to hear, saving face, or rewriting history than actual work. Welcome to the modern age of communication, it's new so it can't be old.

Uh (0, Informative)

Anonymous Coward | more than 3 years ago | (#33277030)

His actual comments [gizmodo.com]

Read it. Other than the solid date he predicts, it's pretty plausible.

1. Technology is growing exponentially
2. The brain isn't some magical soul-endowed jesus box. It's a function of physics

Here's how that math works, Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.
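Kurzweil's arithmetic, taken at face value, can be traced step by step; note that every ratio below is his assumption, not a measured fact:

```python
# Trace of Kurzweil's back-of-envelope arithmetic.
# Every ratio here is his assumption, not a measured fact.
base_pairs = 3_000_000_000       # human genome: ~3 billion base pairs
bits = base_pairs * 2            # 2 bits per base (4 possible bases)
raw_bytes = bits // 8            # 750 million bytes ("about 800 million")
compressed = 50_000_000          # his claimed size after lossless compression
brain_share = compressed // 2    # he attributes about half the genome to the brain
bytes_per_line = 25              # his implied bytes-per-line-of-code figure
lines_of_code = brain_share // bytes_per_line

print(raw_bytes)        # 750000000
print(lines_of_code)    # 1000000
```

The chain of divisions is internally consistent; whether any of the ratios are justified is the whole dispute.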

What's so crazy about that?

Re:Uh (2, Informative)

haydensdaddy (1719524) | more than 3 years ago | (#33277062)

Try reading the rest of the article, not just the pretty quoted stuff...

Re:Uh (2, Insightful)

pnewhook (788591) | more than 3 years ago | (#33277158)

Ray Kurzweil is yet another computer programmer blathering on about things that he has no understanding of. The vast majority of software people I know do that; I don't understand why this guy gets to publish books on it.

Re:Uh (5, Funny)

nomadic (141991) | more than 3 years ago | (#33277174)

Ray Kurzweil is yet another computer programmer blathering on about things that he has no understanding of.

Get Kurzweil a slashdot account, stat!

Re:Uh (4, Informative)

spiffmastercow (1001386) | more than 3 years ago | (#33277414)

He actually has one.. And he's a dick, too.

Re:Uh (2, Insightful)

GizmoToy (450886) | more than 3 years ago | (#33277186)

"Read it. Other than the solid date he predicts, it's pretty plausible."

No it's not. If it were possible to do in a million lines of code, it would have been done by now. Windows XP had something like 40 million lines of code. While we can agree it was probably coded relatively inefficiently, there is no way that any OS even comes close to the complexity of the brain.

Re:Uh (1)

mikael_j (106439) | more than 3 years ago | (#33277280)

Well, I suspect that in the right (machine) language it could very well be possible to put together the necessary instructions for generating a perfectly simulated brain; the problem is that we have little clue when it comes to what instructions to feed our imagined brain-simulation hardware. Also, when/if we do figure out how to simulate a brain accurately, our code will most likely at first be very crude and un-optimized.

Re:Uh (2, Insightful)

AvitarX (172628) | more than 3 years ago | (#33277214)

Aside from the article, it would appear arbitrary to apply loss-less compression to the LOC.

Code must be very very compressible losslessly (I am betting like 90%, as plain text is often 80% when zipping). This would imply to me that one would need 10 times as many LOC as the (faulty) premise assumes.

The article itself points out that it is not just a matter of writing the code, but also simulating the machine. So yes, if we could accurately make a machine (real or virtual) that could compute the way that DNA computes, perhaps we could then make a brain that functions in it with not too much code, but it does not follow that we can describe it just as tersely on the computers we have (Turing machines?).
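The compressibility claim is easy to sanity-check with a stock compressor; the sample text below and the resulting ratio are illustrative only, not a benchmark:

```python
import zlib

# Sanity check: how compressible is repetitive source-like text?
# A degenerate sample (one function repeated 200 times) compresses
# far better than real code, but it shows the mechanism.
sample = ("def add(a, b):\n    return a + b\n\n" * 200).encode()
packed = zlib.compress(sample, 9)
saved = 1 - len(packed) / len(sample)
print(f"saved {saved:.0%} of the original size")
```

Real source trees compress less well than this, but 80-90% savings on plain text is the right ballpark.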

Re:Uh (1)

AvitarX (172628) | more than 3 years ago | (#33277314)

I should add, I don't think it is unreasonable to think we can simulate the brain in 10 years, but I don't think looking at DNA is the way to do it.

More likely would be actually reverse engineering the brain by looking at brains, and simulating neurons in software, or even hardware.

If it is done, at first I would imagine static brains with limited learning ability (unable to create new neurons, only adjust connections in existing ones), then later ones able to create new neurons and restructure. Eventually I imagine a simulated brain will be able to expand like a child's, but indefinitely.

None of this means necessarily we will have hardware that can keep up, and do the processing in real time with a software simulation.
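The "static brain" described above -- a fixed set of neurons where learning only adjusts existing connections -- is roughly what conventional artificial neural networks already do. A minimal sketch; the network size and the update rule are illustrative choices, not anyone's actual proposal:

```python
# Fixed set of "neurons"; learning only adjusts connection strengths,
# never adds or removes units -- the "static brain" described above.
N = 4
weights = [[0.0] * N for _ in range(N)]

def hebbian_step(activity, rate=0.1):
    """Strengthen connections between co-active neurons (Hebb's rule)."""
    for i in range(N):
        for j in range(N):
            if i != j:
                weights[i][j] += rate * activity[i] * activity[j]

# Repeatedly present a pattern where neurons 0 and 1 fire together.
for _ in range(10):
    hebbian_step([1.0, 1.0, 0.0, 0.0])

print(weights[0][1])  # connection 0 -> 1 has strengthened
print(weights[2][3])  # never co-active, stays at 0.0
```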

Re:Uh (1)

mcvos (645701) | more than 3 years ago | (#33277240)

What's so crazy about that?

A single line of code has more redundancy than our genes. There are very good reasons why duplicated genes survive.

Sounds reasonable (1, Interesting)

mangu (126918) | more than 3 years ago | (#33277300)

1. Technology is growing exponentially
2. The brain isn't some magical soul-endowed jesus box. It's a function of physics

PZ Myers threw a red herring there. What Kurzweil says is pretty reasonable, he used the total amount of information in the genome to get an upper limit estimate of the amount of library code needed to simulate a brain. I say "library" to differentiate from data, since a lot of our brain information comes from our experiences, i.e. library == instincts.

Myers goes off on a tangent about biochemistry which has nothing to do with the argument. I've never read anything hinting that the way to simulate a human brain would be to simulate how the molecules in the brain behave. We don't build airplanes with flapping wings either; machines can emulate the functionality of a living being without needing to simulate the exact details.

From the number of neurons in the human brain, considering how many interconnections there are and how fast the neurons can fire, I think a machine with one million processing cores at 1 GHz would have approximately the same data handling capacity as a human brain. The rest is software. Neural network software is pretty much routine stuff; the tricky part is learning what the interconnections between the neurons are.
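For what it's worth, the parent's capacity estimate can be written out. The synapse count and firing rate below are assumed round numbers, not measurements, and the result swings by orders of magnitude with those assumptions:

```python
# Back-of-envelope version of the parent's capacity estimate.
# Synapse count and firing rate are assumed round numbers, not measurements.
neurons = 1e11         # ~100 billion neurons (the commonly cited figure)
synapses_per = 1e3     # assumed average connections per neuron
firing_hz = 100        # assumed peak firing rate
brain_events = neurons * synapses_per * firing_hz  # synaptic events per second

cores = 1e6            # the proposed machine: one million cores
clock_hz = 1e9         # at 1 GHz
machine_ops = cores * clock_hz                     # instructions per second

print(f"brain: ~{brain_events:.0e} events/s, machine: ~{machine_ops:.0e} ops/s")
```

Under these assumptions the two land within an order of magnitude of each other, which is about as much as an estimate like this can claim.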

Re:Uh (2, Insightful)

jpyeck (1368075) | more than 3 years ago | (#33277444)

He's doing the calculation the wrong direction!

The genome contains enough information to build the brain from raw materials. However, this data has already been losslessly compressed by countless generations of evolution. We would need to discover the evolved compression algorithm to "unpack" the 800 million bytes into the 3.2 billion bytes (using his factor-of-4 ratio) in order to begin understanding it.

It would be nice.. (5, Informative)

djlemma (1053860) | more than 3 years ago | (#33277034)

Would be nice if the summary even hinted at what the ridiculous claim actually WAS...

Namely, that we'll be able to reverse engineer the human brain in the next 10 years.

Re:It would be nice.. (2, Funny)

Monkeedude1212 (1560403) | more than 3 years ago | (#33277074)

Yeah, the article quotes some pretty funny statements.

Sejnowski says he agrees with Kurzweil's assessment that about a million lines of code may be enough to simulate the human brain.

You know, the program they had set up in Jurassic Park supposedly had MILLIONS of lines of code, and look how well THAT turned out.

Re:It would be nice.. (4, Funny)

sunking2 (521698) | more than 3 years ago | (#33277152)

2 Million. That takes a lot of cycle time to debug and recompile and may shut down some minor systems. Like the T-Rex and Raptor containment fences, but I wouldn't worry about it too much.

Re:It would be nice.. (1)

ceoyoyo (59147) | more than 3 years ago | (#33277076)

Yeah, but then Jamie would have to read (and copy) more than the first paragraph.

To be fair, PZ obviously slept through the class in high school where they teach you that the first paragraph of a news article should act as an abstract.

Re:It would be nice.. (0)

Anonymous Coward | more than 3 years ago | (#33277238)

Oddly enough, instead of writing an abstract, he wrote a paragraph that actually made you want to read the rest of his opinion piece. I wonder why?

Because the Article Breaks Down the Claim Fully (4, Insightful)

eldavojohn (898314) | more than 3 years ago | (#33277184)

Would be nice if the summary even hinted at what the ridiculous claim actually WAS... Namely, that we'll be able to reverse engineer the human brain in the next 10 years.

It's a little more complicated than that. You see, the article actually breaks down the logic behind that statement and points out how poor it is. Here's the initial part of Kurzweil's argument:

Sejnowski says he agrees with Kurzweil's assessment that about a million lines of code may be enough to simulate the human brain.

Here's how that math works, Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.

I have only taken high school biology but I know that the genome doesn't magically become the brain. It goes through a very complex process: to amino acids, which fold into proteins, which in turn make cells, which in turn make tissues, which in turn comprise the human brain. To say we fully understand this transformation is a complete and utter falsity, as demonstrated by our novice understanding of the twisted beta amyloid protein that we think leads to Alzheimer's. Which proteins a given sequence of amino acids folds into is, I believe, largely an unsolved search problem (hence efforts like Folding@Home). And he claims that in ten years not only will we understand this process but we will ... reverse engineer it?

The man is insane. I've posted about this same biologist criticizing him before [slashdot.org] and it looks like P.Z. Myers just decided to take some extra time to point out how imprudent Kurzweil's statements are becoming. Kurzweil will show you tiny pieces of the puzzle that support his wild conclusions and leave you in the dark about the full picture and pieces that directly contradict his statements. This is a dangerous and deceptive practice that -- despite my respect for Kurzweil's work in other fields -- is rapidly turning me off to him and his 'singularity.' He's becoming more Colonel Kurtz than Computer Kurzweil.

Re:Because the Article Breaks Down the Claim Fully (1)

Dalzhim (1588707) | more than 3 years ago | (#33277338)

If you can program with any programming language without understanding every sublayer beneath it, I don't see why you couldn't do the same with DNA without understanding all the physics and chemistry that makes it work.

Besides, if you read Kurzweil's statement, he's saying we'll reverse engineer the various inputs that can be given to the brain, not the brain in its entirety. PZ Myers has got it all wrong and jumped to ridiculous conclusions.

Re:Because the Article Breaks Down the Claim Fully (1)

MozeeToby (1163751) | more than 3 years ago | (#33277478)

Besides, if you read Kurzweil's statement, he's saying we'll reverse engineer the various inputs that can be given to the brain, not the brain in its entirety.

If this is true, then the whole thing goes from flamebait to troll in my opinion. We might, if we really work on it hard, be able to simulate a housefly's brain in ten years time. The writer of the blog correctly points out that we are a long ways from understanding how the human brain works. But, if Kurzweil was only talking about the inputs to the brain... that's a heck of a lot easier. You have working versions of everything right in front of you, all it really takes is a way to monitor the important nerves, gather data, and analyze that data. Heck, we could probably start working on the pertinent issues today, if we haven't already.

But then, why does the quoted statement talk about the genome and how it relates to brain development, neither of which has much to do with understanding the inputs to the human brain? In other words, do you have a source for Kurzweil's complete statements so that we can evaluate what he said ourselves?

Re:Because the Article Breaks Down the Claim Fully (4, Insightful)

jeff4747 (256583) | more than 3 years ago | (#33277368)

There are a few more major flaws.

The proteins/cells that make up the brain are only part of the story. The protein/cell level is roughly what a newborn can do. The rest of brain development is creating and tearing down billions of interconnections between neurons. It's those interconnections that turn the brain from a pile of goo into something interesting, and we have no understanding of how that mechanism works.

Second, 3 billion base pairs does not mean 6 billion bits: DNA is base-4, not base-2, and the pairs are the units of information, not the 2 nucleotides that make up the pairs.

Third, source code isn't compressed.

Fourth, there isn't much redundancy in a gene sequence. There is redundancy in that we have 2 copies of our genome, but that's already accounted for by the '3 billion base pairs' number. While there's a lot of 'junk' DNA, there isn't much (if any) redundant DNA.
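For reference, this is what 2-bits-per-base packing looks like; how many bits the genome "really" carries, given the points above, is exactly what's in dispute:

```python
# A four-letter alphabet packs into 2 bits per base (one strand only;
# the complementary strand adds no new information).
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq):
    """Pack a DNA string into an integer, 2 bits per base."""
    n = 0
    for base in seq:
        n = (n << 2) | CODE[base]
    return n

print(bin(pack("ACGT")))  # 0b11011
```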

Re:Because the Article Breaks Down the Claim Fully (3, Insightful)

MozeeToby (1163751) | more than 3 years ago | (#33277400)

it looks like P.Z. Myers just decided to take some extra time to point out how imprudent Kurzweil's statements are becoming. Kurzweil will show you tiny pieces of the puzzle that support his wild conclusions and leave you in the dark about the full picture and pieces that directly contradict his statements.

He staked his reputation on a timeline that everyone but him knew was impossible, and now he tries to find little pieces of evidence to support the idea that we are still on that timeline. As reality and his predictions diverge further from each other, his claims and evidence become weaker, until the day he predicted the singularity would happen passes by and he is forced to revise his proph-... er, prediction. Even assuming his basic premise is correct (an idea on which I feel there isn't enough evidence to say either way), it should be obvious by now that his time scales are way, way off, probably by at least an order of magnitude. He would better serve himself and his causes by admitting his mistake and reevaluating his predictions.

Re:It would be nice.. (1)

MozeeToby (1163751) | more than 3 years ago | (#33277234)

Basically, the claim is that the brain is controlled (or created, or described) in the genome. On some levels that's true, all the information you need to create the brain is in the genome but on other levels it is ludicrous. DNA doesn't say 'put this cell here', 'connect with these neighbors', etc. We're a long way from understanding all the interactions that go into turning a strand of DNA into a working organ, let alone one as complex as the human brain. A more accurate title would have been "Ray Kurzweil Does Not Understand How DNA Translates into Physical Morphology" but then that isn't as catchy.

Now to be fair to Mr Kurzweil, it's entirely possible that he is being taken out of context or perhaps didn't make his thoughts clear. I find it at least plausible that we will be able to accurately simulate the development of an organism from its DNA, but only by directly simulating it; we wouldn't really understand most of what we would see any better than if we looked at the same interactions in the real thing (though of course if the simulation were accurate enough it would almost definitely provide unique insights). I doubt that we'll be at human brain level by that time, but maybe flatworms or houseflies.

From TFA: (1)

mcvos (645701) | more than 3 years ago | (#33277264)

To simplify it so a computer science guy can get it, Kurzweil has everything completely wrong.

Best summary I've ever read. (Though a bit more respect for our ability to get it would have been nice.)

Re:It would be nice.. (5, Insightful)

rotide (1015173) | more than 3 years ago | (#33277308)

You're looking for a level of effort above pure copy'n'paste and as such are asking for way too much. Slashdot submissions and editing have gotten so bad that the summaries are generally misleading if not entirely wrong. The summaries tend to be nothing more than the submitter taking the most polarizing sentence/paragraph from TFA and pasting it into the summary field. RTFA is no longer to glean more details for the sake of learning more or backing up your opinions in comments... RTFA is now necessary to understand just what the fuck the submitter wants us to learn. The term "summary" appears to be _entirely_ lost now, at least in the Slashdot story submission crowd.

Re:It would be nice.. (0)

Anonymous Coward | more than 3 years ago | (#33277460)

What will happen first: reverse engineering of the brain, or a flying car for the masses?

Re:It would be nice.. (1)

BarryNorton (778694) | more than 3 years ago | (#33277482)

The summary was so infuriating that I actually clicked through and found an interesting article. Much more of this meaningless bullshit in my RSS reader, though, and I'll dump Slashdot and go back to not visiting the website for years at a time.

Me neither (0)

Anonymous Coward | more than 3 years ago | (#33277042)

It just gives me headaches

Not surprised (1)

E IS mC(Square) (721736) | more than 3 years ago | (#33277044)

If the complaint here is about how the media lap up these 'pundits', well, welcome to new-age media. David Pogue is very popular too, but some of the nasty stuff he writes that is factually wrong goes completely unnoticed.

But, but, but... (3, Insightful)

shmeck (583877) | more than 3 years ago | (#33277052)

...he must be right! He used math, and everything! I'm a little shocked that Kurzweil equates blueprints with the functioning organ. I am not shocked, however, that the tech media latched onto this--at first blush it sounds so *reasonable.*

Re:But, but, but... (1)

mcvos (645701) | more than 3 years ago | (#33277358)

What I thought when I read TFA was: maybe the brain actually can be described by a million lines of code. In the same way the Mandelbrot fractal can be described by a single formula.

A million lines of code wouldn't describe a working brain, it would describe the tools that, given enough time and food, would create a working brain.

Consider how Deep Thought couldn't calculate the Ultimate Question, but could describe a machine that would be able to calculate it.
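The Mandelbrot analogy above is apt because the whole fractal really does come from one recurrence, z -> z*z + c. A minimal membership test:

```python
def in_mandelbrot(c, max_iter=100):
    """True if c appears to belong to the Mandelbrot set: iterate
    z -> z*z + c and check whether the orbit stays bounded."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # escaped: c is definitely outside the set
            return False
    return True

print(in_mandelbrot(0))   # True: the orbit of 0 never escapes
print(in_mandelbrot(1))   # False: 0, 1, 2, 5, 26, ... diverges
```

All of the set's unbounded visual complexity is downstream of those few lines, which is the sense in which a short description can still specify something enormously intricate.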

IT press gasbags (4, Interesting)

benjfowler (239527) | more than 3 years ago | (#33277054)

Kurzweil is just the latest in a long line. How do you think publications like Fast Company and Wired get written?

And does anybody remember JonKatz?

Re:IT press gasbags (1)

DNS-and-BIND (461968) | more than 3 years ago | (#33277196)

Oh GOD. I had totally forgotten about that guy until you said that. Thanks for dredging up painful memories. The guy is the reason that they implemented a killfile on Slashdot, due to the huge number of complaints. I wonder if he's still blocked on my account, I'm too lazy to check.

Brain? (0)

Anonymous Coward | more than 3 years ago | (#33277056)

What is brain?

worst article ever (0)

Anonymous Coward | more than 3 years ago | (#33277060)

This is a new low for /.! Who the hell is Ray, and what is the claim?

Re:worst article ever (0)

Anonymous Coward | more than 3 years ago | (#33277244)

This is a new low for /.! Who the hell is Ray, and what is the claim?

Agreed. Is it too much to ask to have a bit of content in the article summary? I realize that I can RTFA, but having some actual explanation of what the article is about might make me more interested in actually reading it. Isn't that the point of the summary after all?

Re:worst article ever (3, Informative)

TheCycoONE (913189) | more than 3 years ago | (#33277246)

I haven't read the article yet, but Ray Kurzweil is a technology speculator - like a sci-fi writer, except that he doesn't make up a story to go with his ideas and tries harder to convince people they're actually going to happen. He wrote "The Age of Intelligent Machines" and "The Age of Spiritual Machines", where he takes a hard AI stance that computer thought can become indistinguishable from human thought. He is also a proponent of the technological singularity.

Generally his ideas aren't taken very seriously by academia in Computer Science, or at least that has been my experience. The philosophy department at my university sometimes enjoyed going over his ideas; but the philosophy department at my university was very fond of pseudoscience.

Re:worst article ever (1)

mcvos (645701) | more than 3 years ago | (#33277364)

Who the hell is Ray,

Some guy that's wrong.

and what is the claim?

Who cares? It's wrong anyway.

10 years?! (3, Funny)

Rik Sweeney (471717) | more than 3 years ago | (#33277068)

His latest claim is that we'll be able to reverse engineer the human brain within a decade

Amateur. I could put something together to simulate the human brain in about 8 months.

(Plus another 3 minutes at the start)

Re:10 years?! (4, Funny)

ColdWetDog (752185) | more than 3 years ago | (#33277132)

Amateur. I could put something together to simulate the human brain in about 8 months.

More like half an hour. It doesn't take Jello all that long to set up.

(Just finished an hour drive in Seattle - my current impression of the human brain isn't particularly complimentary.)

Re:10 years?! (0)

Anonymous Coward | more than 3 years ago | (#33277178)

Sorry, that doesn't count. You need to beta test it for a couple of decades.

Re:10 years?! (0)

Anonymous Coward | more than 3 years ago | (#33277404)

More like three seconds, I heard.

Re:10 years?! (1)

mcvos (645701) | more than 3 years ago | (#33277426)

I had my wife do the development work. (I was just there for the kick-off meeting.)

Don't underestimate the amount of time and work it takes to get the end result up and running properly, though. That could take years.

A biologist doesn't understand programming (2, Insightful)

mangu (126918) | more than 3 years ago | (#33277088)

FTFA: The end result is a brain that is much, much more than simply the sum of the nucleotides that encode a few thousand proteins.

Likewise, the end result of a computer is much, much more than simply the sum of the commands that encode a CPU's instruction set.

Re:A biologist doesn't understand programming (4, Insightful)

commodore64_love (1445365) | more than 3 years ago | (#33277220)

No not really.

A computer is a fixed system. If you tell it to do A (via software), you know you will get B, based upon knowledge of how the circuits are hardwired. The same cannot be said of the human brain, because it has the ability to change its hardware (via growing new connections between neurons).

Shurely Shome Mishtake? (0)

Anonymous Coward | more than 3 years ago | (#33277350)

No not really.

A computer is a fixed system. If you tell it to do A (via software), you know you will get B

You clearly don't work where I do...

AC... for a reason!!!

Re:A biologist doesn't understand programming (3, Interesting)

nofx_3 (40519) | more than 3 years ago | (#33277430)

Obviously you've never heard of FPGAs: http://en.wikipedia.org/wiki/Field-programmable_gate_array [wikipedia.org] While you can't add new connections in the strictest sense, you could conceivably create a chip with a whole bunch of generic unused hardware and, in the rest of the hardware, program an algorithm that allows new connections to be made with that raw material.

Re:A biologist doesn't understand programming (3, Insightful)

raddan (519638) | more than 3 years ago | (#33277290)

You're being conveniently trite here, though. That's not a good counter-argument. This particular biologist seems to have a pretty good grasp on the fundamental problem with Kurzweil's argument, and that problem is: Kurzweil confuses the purpose of the genome. It is not "the program"! Myers contends that, really, it's more like data. To me, this sounds like a classic Von Neumann architecture: it's a bit of both, depending on your context. In any case, Kurzweil completely misses out on the fact (and he would know this if he had followed *anything* in genomics over the last 15 years) that the genome, as encoded in DNA, is only a small part of what makes a cell express and function in a particular way. A nice introduction to the epigenome was in this NOVA documentary [pbs.org].

Re:A biologist doesn't understand programming (1)

mangu (126918) | more than 3 years ago | (#33277504)

This particular biologist seems to have a pretty good grasp on the fundamental problem with Kurzweil's argument, and that problem is: Kurzweil confuses the purpose of the genome

As I already mentioned in another post [slashdot.org], I think PZ Myers distorted or misunderstood Kurzweil's ideas.

A better analogy would be to say that the genome is akin to the VHDL [wikipedia.org] program used to design the CPU. The VHDL code needed to fully describe a computer's electronics is typically much smaller than the code used in that computer's programs.

Re:A biologist doesn't understand programming (1)

wiredlogic (135348) | more than 3 years ago | (#33277344)

The brain/mind is an emergent system that develops from a multitude of stochastic processes. Nobody really knows how it works. Computers are designed to be deterministic and only deviate under faulty conditions caused by human error or design flaws.

Ignorance (0)

Anonymous Coward | more than 3 years ago | (#33277110)

"Reverse-engineering the brain is being pursued in different ways," says Kurzweil. "The objective is not necessarily to build a grand simulation - the real objective is to understand the principle of operation of the brain."

You could just read the article, and understand that this isn't designed to create a human brain from a digital base, it's to understand the operation of an already developed brain.

Bad compsci (4, Insightful)

bluefoxlucid (723572) | more than 3 years ago | (#33277118)

Sejnowski says he agrees with Kurzweil's assessment that about a million lines of code may be enough to simulate the human brain.

Here's how that math works, Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.

Idiot. The design of the brain is encoded in the genome in the same way that the design of a 4KiB program is encoded in its load module: useful for running the program on its original hardware.

But then you have architectural issues. That 4KiB of information does not run unless it's supported by a complex operating system, which itself is supported by complex logic in an ISA and a memory management architecture backing it up. And all that is implemented on a specific design in a specific physics model.

Translating that program to SPARC takes work, and it comes out roughly the same size. Translating that program to a progression of chemical reactions produces something vastly different, especially since you need new middleware (the chemical environment) running on top of different physics (chemistry).

Translating a physical architectural design from chemistry to computer logic on top of a given ISA is the same problem. You now have odd issues that are messy, and then the program running on the brain needs to be built again. That program is even more complex and less known.

Re:Bad compsci (0)

Anonymous Coward | more than 3 years ago | (#33277330)

I guess what he's saying is that the average quantity of "pure" (incompressible) information is 50 MBytes.
No matter how complicated the steps to build a brain from these 50 MBytes are, there's no process that's adding extra information (just like the entire complexity of the Mandelbrot set is entirely coded in its simple recurrence formula).
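The Mandelbrot analogy, made literal (a toy escape-time membership test; the entire "genome" of the set is the recurrence z → z² + c):

```python
def in_mandelbrot(c, iters=100):
    """c is in the set if z = z*z + c (starting at z = 0) stays bounded."""
    z = 0
    for _ in range(iters):
        z = z * z + c
        if abs(z) > 2:        # once |z| > 2, the orbit provably diverges
            return False
    return True

print(in_mandelbrot(0))       # 0 -> 0 -> 0 ... stays bounded
print(in_mandelbrot(1))       # 0 -> 1 -> 2 -> 5 ... escapes
```

A dozen lines of code, unbounded apparent complexity; the open question is whether "build a brain from 50 MB" has anywhere near that kind of clean decoder.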

Re:Bad compsci (0)

Anonymous Coward | more than 3 years ago | (#33277464)

Well, it is true that the decoder itself has to be taken into account; the line between what constitutes the code and what is the decoder is arbitrary, after all...

Whaaa... (2, Insightful)

al3k (1638719) | more than 3 years ago | (#33277128)

About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.

I'm not even sure what to say about this statement

The design of the brain is in the genome (0)

Anonymous Coward | more than 3 years ago | (#33277146)

I agree that this statement is wrong. The genome means less than ever. The idea that sequencing a genome is the endpoint of understanding has less power than ever.

These "genomes" don't really exist in nature. In nature sex cells come together and are "reprogrammed" by methylation and other mechanisms. Each cell line is reconfigured and changes throughout cell development and interaction with its environment.

The action is in functional genomics and epigenetics. You won't find it in the mainline code of some dead cell.

The design of the brain may bootstrap from the genome, but the design of the brain is in the runtime environment, which could be infinitely more complicated than the genome.

Junk DNA anyone?
     

Man (2, Funny)

Pojut (1027544) | more than 3 years ago | (#33277154)

I wish I could get a job as a futurist....think about it:

"What do you think is going to happen in the future???"
"Um...dogs will bring soda to you when you whistle a Cradle of Filth song?"
"OMFG THATZ BRILLIANT. HERE R MONIES, PLZ HAS MAH BABBIES!"

Re:Man (1)

Tablizer (95088) | more than 3 years ago | (#33277270)

1. Troll on AI
2. Hide the "???" from press
3. Profit!

Kurzweil is AI.. and somewhat buggy (1)

ahodgkinson (662233) | more than 3 years ago | (#33277164)

Ray Kurzweil has been making claims for AI for years: for example, that we will have an AI singularity event and that society will be completely replaced by machines. Well, decades later it still hasn't happened, and the only things in the field of computer science that seem to have a life of their own are spam and computer viruses. I'd like to call them a life form.

Will we reverse engineer the brain any time soon? I doubt it. Part of the reason is practical: this would be an extremely expensive and time-consuming undertaking. I'm not sure it's even worth it, especially when compared to other branches of science which have made rapid advances. For one example, take a look at the field of neuroscience and its use of fMRI scanning.

Reverse engineering the brain probably is possible, but it's probably not worth it right now. We'll have to wait another decade... again.

Re:Kurzweil is AI.. and somewhat buggy (1)

raddan (519638) | more than 3 years ago | (#33277416)

I think Kurzweil is largely a nutjob, but I disagree with your assessment of the value of simulating the brain. Nearly anyone involved in understanding the brain, from chemistry, to biology, to psychiatry, would benefit from having an accurate brain simulator. It will absolutely be time-consuming. It will absolutely be expensive. It will probably not happen within the lifetimes of us or our children or our grandchildren. But scientists build models and validate them; that's simulation. If the brain really does fit within our model of the universe, we should be able to simulate it, and we should be able to validate it against the real thing. Science itself will always drive us toward simulating it, as a method of understanding it. Maybe AI and brain research become the same thing at some point in the future?

As for the "Singularity"... I don't know, and, unless you write sci-fi, I don't really care, because it's a distant possibility.

What Kurzweil Doesn't Know (1)

colmore (56499) | more than 3 years ago | (#33277166)

What Kurzweil doesn't understand could fill a book.

Several [amazon.com] in fact.

Re:What Kurzweil Doesn't Know (0)

Anonymous Coward | more than 3 years ago | (#33277304)

Wouldn't it be more appropriate to say the things he thinks he understands, but doesn't?

Might not be wrong even if the reasoning is (0)

Anonymous Coward | more than 3 years ago | (#33277192)

The author is almost certainly correct that encoding the genome structure will be insufficient to produce a simulated brain, but it is also very possible that neural computation is built on a small set of principles that could be duplicated in a million lines of code. Of course, first we'll have to figure out what those principles are. It might happen in the next ten years, but considering how much time and effort (close to 50 years) has gone into understanding the low-level visual system, and how little we still know about its function, I wouldn't bet on it.

Laughable (3, Insightful)

brain1 (699194) | more than 3 years ago | (#33277206)

Let's see. On another recent article it was stated that the average car has several million lines of code running in it. I haven't come across a sentient Prius yet.

And there's that pesky parallel processing the brain does. I don't think that a rack full of Nvidia Tesla cards can approach the average two year old's parallel processing capability.

I agree, Kurzweil is smoking something and not sharing.

Re:Laughable (0)

Anonymous Coward | more than 3 years ago | (#33277228)

Considering the Prius wasn't designed to be sentient, it's no wonder you haven't come across one yet.

And the computing power capable of emulating the human brain isn't a rack of nVidia Tesla cards, it's a computer with requirements about 20 times that of all the PS3s in the world connected together (roughly 10^15 petaflops).

Re:Laughable (1)

Dalzhim (1588707) | more than 3 years ago | (#33277448)

The brain isn't all-knowing either. Several million lines of code could very well be enough to design the entire universe and more, depending on how advanced and capable the primitives of the programming language you're using are.

What has the brain been programmed for, anyway? Learning algorithms, which make you able to learn stuff and fill in the gaps that aren't programmed. Reflexes (automated movements that bypass any thinking, for speed). Memory management (conscious and subconscious). Well, that's about it. The rest you can figure out by yourself using the learning algorithms you have built in, and in a few years maybe you'll know how to talk and walk.

That's not reverse engineering (1)

Dalzhim (1588707) | more than 3 years ago | (#33277218)

Ray Kurzweil never says we'll be able to write software to emulate the brain completely in ten years. He just says we'll have reverse engineered the signals we have to send to the cortex to interact with it. Reverse engineering involves analysing and understanding something. Reengineering anew would be the next step. But then again, when you're only interested in reverse engineering the various inputs you can give to the brain to get different results, you're still very far away from understanding how to build a brain.

PZ Myers clearly doesn't understand reverse engineering and writes up a useless article based on his erroneous comprehension of Ray Kurzweil's prediction.

50 Megabytes is WAY too much ! (5, Funny)

mbone (558574) | more than 3 years ago | (#33277248)

Sejnowski says he agrees with Kurzweil's assessment that about a million lines of code may be enough to simulate the human brain.

Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

Dude, the equations of quantum mechanics can be written on one page. General Relativity can be written on a second page. What more do you need? Clearly, a few hundred lines of code (and a few do loops) should be enough to simulate the entire universe, brains and all.

Glad we cleared that up. All you physicists and astronomers can go home now and work on your resumes.

surely not everything (0)

Anonymous Coward | more than 3 years ago | (#33277272)

"To simplify it so a computer science guy can get it, Kurzweil has everything completely wrong" -- Too strong a statement. I've read some of Kurzweil's books, and while they may be a little bit on the optimistic side, they are quite imaginative and thought provoking. Considering that no one as of yet understands the brain, or has more than an inkling of an idea how mind and consciousness emerge, it is a little dramatic to proclaim his having 'everything wrong.' That there is a gigantic amount of redundancy in the design of the brain is true, and interesting, and a valid place to start when considering how one might go about inventing one, real or virtual. We'll see in ten years, now won't we?


Kurzweil is interesting. . . (1)

RazorSharp (1418697) | more than 3 years ago | (#33277286)

but I've always had the impression that his 'philosophy' is inspired by Star Trek rather than the scientific method or logic. His AI claims always seem to be a vast underestimation of the complexity of the human mind. "Wishful thinking" is the best phrase to describe his ideas. Regardless, the summary for the article could have been an actual summary rather than just a copy-paste of the first couple lines.

Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#33277298)

Kurzweil may be off on the AI side but biology may someday create a singularity in recreating intelligence.
Gene Networks are now being used to model cancer and cell communications are critical to cancer growth.

Dr. Judah Folkman pioneered this work in the 80’s.

http://www.amazon.com/Dr-Folkmans-War-Angiogenesis-Struggle/dp/0375502440

Craig Venter used ocean genomes to show viral lateral transfers evolve DNA.

http://scratch.mit.edu/projects/GeneMachine/51835
http://scratch.mit.edu/projects/GeneMachine/50308

A network analyzer may be able to model gene protocols used by cancer.

Cancer reverts to the primitive LTR messaging of viruses in junk DNA.

http://www.sciencedaily.com/releases/2010/05/100502173845.htm

Network protocol analyzer – Wireshark
http://www.wireshark.org/

Using the Cancer Databases may allow a Wireshark for Cancer to be created.
Database for Personalized Cancer Treatment - Sanger
http://www.sciencedaily.com/releases/2010/07/100715152901.htm

Each cancer drug would create a map of gene expression and gene network protocols.
Using comparative genomics with other organisms a multi species gene map may be possible.
Using Goalie a network model will allow scientists to understand gene expression and cell communications.
Goalie = Wireshark (ethereal) for Cells – Future Cancer Diagnostic –
Gene cluster analysis
http://bioinformatics.nyu.edu/~marcoxa/work/GOALIE/

Students could then use http://processing.org to build gene models of complex cancer systems.
The real future is not AI it is in reducing health care costs through molecular biology.
In understanding the biology we may someday have a clue of how the brain works and how AI works.
AI will require a context switching machine similar to DNA. All present silicon solutions require a programmer and if statements.
Viruses build genomes with no programming but with shareware.
Each drop of seawater has a million viruses and 1 thousand bacteria.
The ocean is a genetic computer.
http://scratch.mit.edu/projects/GeneMachine/51835

By building systems biology and gene networks, today's students may one day understand cross-species protocols and common gene languages.

    Genomic Test of Tumor DNA

    http://www.genomichealth.com/OncotypeDX/Index.aspx

    Excellent Breast Cancer Tutorial for students

    http://gcat.davidson.edu/Pirelli/index.htm

Ignorance (1)

Teslonomikon (1880778) | more than 3 years ago | (#33277302)

If you read the article closely, the engineering is designed to understand operation, not create a grand simulation. You can all quit touting your brainless responses now.

The Devil's In The Details (1)

hondo77 (324058) | more than 3 years ago | (#33277322)

...Kurzweil's assessment that about a million lines of code may be enough to simulate the human brain.

If you accept that, then the real problem to solve becomes: in what language do you write the code?

;-)

Only people dumber than Kurzweil... (0)

Anonymous Coward | more than 3 years ago | (#33277328)

...are slashdot editors. There isn't even a pretense of a story here. I've seen more informative Twitter summaries.

PZ Myers does not understand computers ... (4, Interesting)

Lazy Jones (8403) | more than 3 years ago | (#33277334)

Kurzweil seems to understand the basics of Algorithmic Information Theory [wikipedia.org], whether by intuition or study, I can't tell. What I can tell is that PZ Myers has problems comprehending the interaction of code and data (hint: the history of billions of cells is data), and the fact that, seen from outside the field of highly specialized machines for processing digital information, 8 bytes of code can seem to do an extremely complex piece of work on their environment, just like small proteins observed from outside their "working environment". When we model the brain successfully, we will probably not do it by simulating proteins and their environment; we will simply simulate the input/output, i.e. at a higher level than what gets PZ, who wants to plug proteins into computers, so aroused.

To simplify it so a computer-science-ignorant biologist with a tendency to inane rants can possibly get it: you don't need to simulate electrons in a semi-conductive material at specific temperatures in order to build a complete working emulator for an old computer.

Sigh (1, Insightful)

Co0Ps (1539395) | more than 3 years ago | (#33277336)

Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes

This is so retarded that it's sad. Why is it so hard to understand how compression algorithms work? Saying that X can be compressed into Y bytes doesn't say ANYTHING. You can "lossless-compress" ANYTHING into 1 bit by using the function that takes that one input and returns the bit "1" (and takes anything else and returns "0" + that input). What does compression have to do with anything? The stupid hurts...
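To be fair, that 1-bit scheme isn't lossless compression in the Kolmogorov sense, since the decompressor itself has to be counted; but the underlying point stands: a quoted compressed size means nothing without fixing the decompressor. A quick sketch with one off-the-shelf decompressor (zlib):

```python
import os
import zlib

# "Compressed to N bytes" is only meaningful relative to a fixed decompressor.
redundant = b"ACGT" * 1000              # 4000 bytes, highly repetitive
small = len(zlib.compress(redundant))   # collapses to a few dozen bytes

random_ish = os.urandom(4000)           # 4000 bytes of noise: incompressible
big = len(zlib.compress(random_ish))    # no gain; framing overhead adds bytes

print(small, big)
```

The genome sits somewhere between those two extremes, and how much zlib-style redundancy removal tells you about "design information" is the real question.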

No one understands The Brain... (3, Funny)

Chris Tucker (302549) | more than 3 years ago | (#33277340)

...except, maybe, Pinky.

Compression Algorithm (0)

dorkinson (1615103) | more than 3 years ago | (#33277352)

Given that we only use 10% of our brains (less, for women), I would think we could compress the code even further! We'd probably want to optimize the think_about_sex subroutine too.

Nobody does. (0)

Anonymous Coward | more than 3 years ago | (#33277382)

The human brain is far more complex than all of the technology created by man put together.

understanding the brain (1)

gordona (121157) | more than 3 years ago | (#33277386)

"If the brain were simple enough to be understood, it would be too simple to understand itself"--anonymous

If all of our folly were turned to intelligence and divided amongst a thousand toads, each would be more intelligent than Aristotle

R'ing TFA? Heresy! (4, Funny)

shish (588640) | more than 3 years ago | (#33277456)

Scanning down the comment list, it looks like every (+2 or more) comment has read the article and is quoting from it -- what has happened to the slashdot I knew and loved?

He may not understand how the brain functions, (1)

Capt James McCarthy (860294) | more than 3 years ago | (#33277466)

But he understands the psychology behind selling vaporware.

The brain uses quantum effects (0)

Anonymous Coward | more than 3 years ago | (#33277476)

How do we calculate the complexity of the human brain? It's not by counting neurons (about 100 billion) or synapses (the connections between neurons, about 100 trillion). Individual receptors play a role in information processing (there are no good estimates of how many receptors there are in the brain, but it's at least an order of magnitude greater than the number of synapses). There is now some evidence that receptors use quantum effects to perform information processing. That means that in order to duplicate the brain we may need a quantum computer with quadrillions of qubits. That's not going to happen in the next 10 years.

He's right actually (1)

AC-x (735297) | more than 3 years ago | (#33277516)

The human brain could be simulated with a million lines of code; however, it would need to be run inside an emulator that simulates physics perfectly and has enough RAM to store the quantum state of the approx. 1.5×10^26 atoms that make up a human brain.
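A quick sense of scale for that RAM requirement, using the parent's atom count (and skipping quantum state entirely; storing just a classical 3D position per atom):

```python
atoms = 1.5e26                    # the parent's estimate for a human brain
bytes_per_atom = 3 * 8            # just x, y, z as 64-bit floats
total = atoms * bytes_per_atom    # 3.6e27 bytes
print(f"{total / 1e21:.1e} zettabytes")   # on the order of 10^6 ZB
```

Millions of zettabytes before you even touch the quantum state, so "has enough RAM" is doing a lot of work in that sentence.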
