Our Brains Don't Work Like Computers

samzenpus posted more than 9 years ago | from the I'm-not-a-machine dept.

Science 737

Roland Piquepaille writes "We've been using computers for so long now that I guess many of you think our brains work like clusters of computers. Like them, we can do several things 'simultaneously' with our 'processors.' But each of these processors, in our brain or in a cluster of computers, is supposed to act sequentially. Not so fast! According to a new study from Cornell University, this is not true: our mental processing is continuous. By tracking the mouse movements of students working with their computers, the researchers found that our learning process is similar to that of other biological organisms: we're not learning through a series of 0's and 1's. Instead, our brain is cascading through shades of grey."


737 comments



really?!? (-1, Flamebait)

jrl87 (669651) | more than 9 years ago | (#12946658)

I never would have thought. Now, how much did this cost and how much did the "researchers" make?

Re:really?!? (3, Insightful)

SamQ (896234) | more than 9 years ago | (#12946724)

I presume the info was a byproduct of a useful study (cog-neuro-psych, possibly?). I really hate it when the media picks out the "And finally..." bit of science news stories (a la bread-landing-on-the-buttered-side, etc.).

Re:really?!? (2, Insightful)

Breakfast Pants (323698) | more than 9 years ago | (#12946895)

Are you saying that Roland would have pointed us to a somewhat useless article?!?? Piquepaille wouldn't do such a thing! Oh wait, he has for his last 80 damn stories.

Re:really?!? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#12946787)

I think the real problem is each time you push for more improvements, the more complex the architecture gets. The article said that most developers would be using only one of the PS3's processors for most operations. Well, when you're used to designing for one processor, you tend to continue designing for one processor.

Each new feature added to the console requires learning, which developers who are used to the last console will do slowly, and maybe reluctantly.

What developers really want is the *exact same* architecture, but much faster, more memory, etc. No more processors, no more complex ways of addressing different caches. Just make the thing the same, only faster, and developers would love it. Initially...

However, a year from now, the developers will learn the basics of the new consoles, and want something more. Then they will get into all those features that the new architecture gives them, and be excited to be the first to make a game that has realistic crumbling concrete when the tank slams into a wall, or whatever else they decide to do.

But asking a developer now about how their next gen console devkit performs is premature.

Re:really?!? (1, Funny)

Anonymous Coward | more than 9 years ago | (#12946864)

how the fuck do you reply to the wrong damn article

Re:really?!? (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#12946890)

YHBT

Have a cookie, brightboy.

Re:really?!? (4, Insightful)

CaymanIslandCarpedie (868408) | more than 9 years ago | (#12946808)

Thank god we have someone like Roland Piquepaille to point out these amazing facts to us!

Yes, that was sarcasm!

Re:really?!? (0)

Anonymous Coward | more than 9 years ago | (#12946877)

Dude... calm down. All he did was link to a collegiate web site. He's not pimping his "blog," he's not getting ad revenue, and he's not just posting fluff for the helluvit. I'm all for jumping down his throat when stuff like that happens, but it didn't this time.

Re:really?!? (1)

flyneye (84093) | more than 9 years ago | (#12946814)

More to the point, who paid?
If it were me, I'd like an audit.
More garbage from those who can't think of anything better to do to rationalize the existence of their dept. than to make half-baked claims and call them studies. Anything for a buck from the taxpayer, I guess.
Didn't they see the Mr. Wizard episode about neurotransmitters and receptors? On and off states = 1s and 0s.
Bad professor, go stand down.

Yes they do (0, Offtopic)

Michael_Munks (869444) | more than 9 years ago | (#12946659)

damnit

Error! (0, Funny)

Anonymous Coward | more than 9 years ago | (#12946660)

Does not compute!

first post! (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#12946662)

yay for first post!!!!!

comparisons (5, Insightful)

sound+vision (884283) | more than 9 years ago | (#12946663)

And it is for this reason that I loathe comparisons of computing power to brain power. "By 2015, we'll have computers as smart as humans." What kind of bullshit comparison is that? They're two completely different processes.

Re:comparisons (5, Insightful)

NoImNotNineVolt (832851) | more than 9 years ago | (#12946760)

"By 2015, we'll have computers sufficiently powerful to simulate a full working model of a human brain in enough detail to be functionally equivalent" would be what is actually being predicted. Because we have no convenient way of quantifying human smarts, like you said we cannot effectively compare how "smart" a computer is with respect to a human. That doesn't mean that computers will not be able to be functionally equivalent to biological intelligences, and there's no logical reason to suspect that they won't be in due time.

Re:comparisons (1)

Hogwash McFly (678207) | more than 9 years ago | (#12946778)

And by 2007, we'll have computers as smart as your average Slashdotter. What's that? You say I'm typing on one already? Well I never!

Fuzzy Networks (2, Insightful)

Sir Pallas (696783) | more than 9 years ago | (#12946664)

That's what I heard. Even if they don't work like sequential or even parallel digital computers, I'm pretty sure that brains still compute. Mine tries, at least.

One word... (-1)

Anonymous Coward | more than 9 years ago | (#12946665)

D'Oh!

No, I didn't think that (0, Flamebait)

Gregg Alan (8487) | more than 9 years ago | (#12946668)

I didn't think our brains were binary. I thought that part of the difficulty in reproducing a mechanical brain was preciously it's shades of grey.

Granted, I'm somewhere over 30. Are younger people that dumb nowadays?

Re:No, I didn't think that (0, Offtopic)

Gregg Alan (8487) | more than 9 years ago | (#12946697)

brain was preciously it's

Dumbass

Re:No, I didn't think that (0, Offtopic)

Tim Browse (9263) | more than 9 years ago | (#12946745)

No apostrophe needed in it's either.

He's a super dumbass!

Re:No, I didn't think that (0, Offtopic)

Gregg Alan (8487) | more than 9 years ago | (#12946786)

Its the ale!

(apostrophe omitted for extra humor)

I hope not... (3, Funny)

Trinition (114758) | more than 9 years ago | (#12946730)

Are younger people that dumb nowadays?

I hope not, because if they are, I must finally be old.

Re: No, I didn't think that (2, Funny)

Black Parrot (19622) | more than 9 years ago | (#12946739)


> I thought that part of the difficulty in reproducing a mechanical brain was preciously it's shades of grey.

It's even made of grey matter.

Re:No, I didn't think that (1)

bestguruever (666273) | more than 9 years ago | (#12946747)

Yes they are. I think it has to do with the influence from all the binary devices they interact with causing them to never develop more than two shades.

I'm 27 and I must be somewhere in the middle because I don't seem to have enough shades to grok the use of the word "preciously" in your post.

Re:No, I didn't think that (0)

Anonymous Coward | more than 9 years ago | (#12946823)

Are younger people that dumb nowadays?

Yes.

-1, Roland Piquepaille (3, Insightful)

QuantumG (50515) | more than 9 years ago | (#12946670)

Fuck off.

Re:-1, Roland Piquepaille (3, Insightful)

backslashdot (95548) | more than 9 years ago | (#12946756)

I agree .. at the cost of a negative mod .. I will say that this guy Roland is a real jerk.

wonder if he's giving kickbacks to samzenpus for posting his stuff.

Re:-1, Roland Piquepaille (4, Insightful)

rpozz (249652) | more than 9 years ago | (#12946785)

I'll burn some karma too. In this article he hasn't posted a link to his plagiarised 'overview'. Is this a poor attempt to make it look like no money changes hands between him and Slashdot?

Re:-1, Roland Piquepaille (0)

Anonymous Coward | more than 9 years ago | (#12946825)

This isn't k5, you know.

I am so smart (-1)

Anonymous Coward | more than 9 years ago | (#12946672)

S M R T

d'oh!

Hmm... (0, Redundant)

aldatur (893868) | more than 9 years ago | (#12946673)

"...our learning process was similar to other biological organisms..."
AMAZING! Who would have made that sort of connection?!

Re:Hmm... (2, Insightful)

binary paladin (684759) | more than 9 years ago | (#12946691)

Yeah. That was pretty much my reaction. Seriously, I think the submitter has been in front of his computer too much.

Re:Hmm... (0)

Anonymous Coward | more than 9 years ago | (#12946727)

Of course, check out who is the submitter.

It makes sense now.

HA! (2, Funny)

countach44 (790998) | more than 9 years ago | (#12946676)

Take that skynet!

Thank God (-1)

Anonymous Coward | more than 9 years ago | (#12946677)

Thank God our brains don't work like Roland Piquepaille's either !!

Wow! (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#12946679)

Next thing you know someone is going to do a study to determine that the sky is actually blue!!!

Re:Wow! (0)

Anonymous Coward | more than 9 years ago | (#12946885)

But the sky ISN'T blue in reality; it's just that under most conditions the color (resulting from the angle of the sun and some rather nasty physics/chemistry) is considered "blue".

something's missing (4, Funny)

justforaday (560408) | more than 9 years ago | (#12946681)

Looks like the submitter forgot something. Lemme see if I can help him out a little:

How will this study affect your next thought? Go here [primidi.com] to discuss it further.

There, that feels more complete.

Re:something's missing (3, Funny)

Anonymous Coward | more than 9 years ago | (#12946762)


S***** Network Administration

site: primidi.com
classification: spam/advertising
access: denied

If you think this is an error please contact ***@**

Re:something's missing (1)

sik0fewl (561285) | more than 9 years ago | (#12946914)

Thanks. I was wondering what the hell was going on (and still am). When I could only find the link directly to the document the blurb was talking about, I scanned over the blurb a couple more times looking for the primidi.com link.

It wasn't until later that I actually noticed that Roland's name itself was the only link to primidi.com in the entire blurb. Weird, eh? Maybe he's trying to get on our good sides.

Next up (0)

Anonymous Coward | more than 9 years ago | (#12946688)

Apples and oranges grow on different trees

The Network is the Computer (5, Informative)

Doc Ruby (173196) | more than 9 years ago | (#12946689)

Each neuron is like a tiny, slow analog DSP, feeding back FM around a base frequency (eg. about 40Hz in the brain's neural tract). The neurons have feedback among themselves locally, and send out some larger feedback in fiber bundles, signalling other clusters along the way. It's like a teeming kazoo symphony, without a conductor.

Re:The Network is the Computer (4, Informative)

SilentChris (452960) | more than 9 years ago | (#12946884)

Well, actually, from the article it sort of sounds like a multibranch computing article I read a while back. I'm not sure if Intel actually went through with this, but the idea was to have a CPU process multiple "paths" ahead of time.

So, for example, for a simple if statement waiting on user input, part of the CPU would process the "true" result of the statement and part would process the "false" one. When the user made a decision, one would be used and one would be thrown out. In theory, computing these branches ahead of time was supposed to be faster than doing things linearly.

Again, though, I'm not sure Intel went through with this. They were the subject of the article.

IBM (1)

savagedome (742194) | more than 9 years ago | (#12946690)

IBM has been working on this [slashdot.org] for a while

Obvious (1, Funny)

Bloater (12932) | more than 9 years ago | (#12946692)

In other news, the sky is blue...

Come on, it's not like this is neuroscience... Oh.

Computers can process "shades of gray" (3, Insightful)

Anonymous Coward | more than 9 years ago | (#12946693)

...with floating point arithmetic. A "double" can represent a number between 0 and 1 with 15 decimals of precision, way more precise than any biological phenomenon. Computers can think like us, it's just a matter of writing the right floating-point code.

Re:Computers can process "shades of gray" (2, Funny)

Bloater (12932) | more than 9 years ago | (#12946729)

Don't forget to gather entropy from meatspace.

Re:Computers can process "shades of gray" (1)

orkysoft (93727) | more than 9 years ago | (#12946799)

Actually, even floating point is overkill here. But I applaud your correct use of the word "phenomenon" here.

Re:Computers can process "shades of gray" (0, Flamebait)

FLAGGR (800770) | more than 9 years ago | (#12946835)

Ugh. That floating point number is still made up of individual bits; on 32-bit x86 a double is 64 bits, and each of those bits is still a binary 1 or 0. Jeez. You know it's possible to have decimals with fixed-point arithmetic too? The only difference is you have to shift the decimal place yourself (and, er, sign changes are different, but we'll forget that stuff for now), so I have no idea what point you're trying to make. A bit is either one or zero, no matter what, even if that bit happens to be part of a floating point number. Jeez.

Fascinating (5, Funny)

Vengeance (46019) | more than 9 years ago | (#12946694)

The idea that our brains might work like biological organisms is a real breakthrough.

Next week's research topic: Do farts stink?

Re:Fascinating (3, Insightful)

QuantumG (50515) | more than 9 years ago | (#12946709)

It's Roland Piquepaille, what did you expect, he's a fucktard and the only reason he's on Slashdot so much is that he has a business relationship with them.

Re:Fascinating (1)

youknowmewell (754551) | more than 9 years ago | (#12946720)

Not my mom's... no sir, hers smell like a bed of roses.

Re:Fascinating (0)

Anonymous Coward | more than 9 years ago | (#12946777)

Indeed. Could this open some eyes and increase interest in alternative (Linux, Mac) offerings?

The real world is Analog. (1)

TerranFury (726743) | more than 9 years ago | (#12946705)

Is this news?

What's interesting is the possibility of modeling electronics around this fact. Analog electronics may see a resurgence as we lose the ability to squeeze more clock cycles out of our digital systems each second.

No shit? (0)

Anonymous Coward | more than 9 years ago | (#12946707)

This isn't even counter-intuitive let alone revelatory.

Missing Comma (5, Funny)

Doc Ruby (173196) | more than 9 years ago | (#12946708)

More like:

Our Brains Don't Work, Like Computers

So basically what this is saying... (2)

windows (452268) | more than 9 years ago | (#12946712)

...is that our brains (like TVs) are inferior analog devices and human brains need to be replaced with new digital versions. :-)

Re:So basically what this is saying... (1)

feepness (543479) | more than 9 years ago | (#12946912)

...is that our brains (like TVs) are inferior analog devices and human brains need to be replaced with new digital versions. :-)

Could I be programmed not to know the difference?

Yep (2, Insightful)

jrivar59 (146428) | more than 9 years ago | (#12946713)

"... Instead, our brain is cascading through shades of grey."


I guess some brains just have more contrast than others...

Like other animals? (1)

Black Parrot (19622) | more than 9 years ago | (#12946715)


Gee, that's a surprise.

Roland Piquepaille is like Leroy Jenkins... (0)

Anonymous Coward | more than 9 years ago | (#12946716)

He's an outcast of the /. community and he jumps to conclusions.

Roooooolaaaaaaaaaaaaaaaaand Piquepaille!

Wow (5, Funny)

CardiganKiller (854899) | more than 9 years ago | (#12946717)

I've been waiting for a scientist to tell me that I'm capable of thinking in abstract and fuzzy terms for years. Things I can now forget thanks to the brilliant scientist:

1.) The GPS coordinates of each key on my keyboard.
2.) The streaming audio of my name and all of my friends' and family's names.
3.) The bio-mechanical force sequences for the hundreds of muscles used in picking up a glass every morning.

Beer will no longer render my circuits useless!

Newsflash (4, Informative)

tupshin (5777) | more than 9 years ago | (#12946721)

Headline: Brains More Like Neural Nets Than Traditional Programs

Who woulda thunk it.

ftp://ftp.sas.com/pub/neural/FAQ.html#A2 [sas.com]

'Most NNs have some sort of "training" rule whereby the weights of connections are adjusted on the basis of data.'

Insert joke about the 1980's (or 60's/50's/40's) calling. Somehow I don't think Norbert Wiener would be the slightest bit surprised.

-Tupshin

Moderators, please attend the parent post. (1)

Paradox (13555) | more than 9 years ago | (#12946759)

The first thing I thought when I read the post and skimmed the article was, "Well duh. What did you think neural nets were?"

I guess it's good to prove it, though.

We are borg... (2, Funny)

StimpyPimp (821985) | more than 9 years ago | (#12946722)

Maybe one day I will have an amd cluster in my skull. Until then, I will accept my alcohol-cooled brain.

huh? (1, Informative)

guardiangod (880192) | more than 9 years ago | (#12946723)

Like them, we can do several things 'simultaneously' with our 'processors.'

How so? Last time I checked, a 'computer brain' (CPU) cannot do multiple operations at the same time, unless you have dual core/CPUs.

A CPU just switches from one task to the other at breakneck speed (yes, I am ignoring pipelines and branch prediction - they are only used to streamline operations).

The human brain works the same way - it may be able to take in multiple kinds of information (sight, feel, sound, smell) at the same time, but it has developed a "filtering" system for unimportant sensory input. Thus you cannot say the human brain performs parallel operations at the same time.

Re:huh? (3, Informative)

Bloater (12932) | more than 9 years ago | (#12946821)

> Last time I checked 'computer brain' (cpu) cannot do multiple operations at the same time, unless you have dual core/cpus.

Yes it can, many have several ALUs and FPUs, and also more than one stage in their pipelines. The above hasn't been true since sometime in the nineties at the latest.

This sounds familiar (1)

rongage (237813) | more than 9 years ago | (#12946725)

Sounds like the elusive "analog computer".

"Shades of grey" sounds like working with analog values (i.e. 0-255) instead of binary levels (on/off) or even trianary values (on/maybe/off).

Re:This sounds familiar (4, Interesting)

Sir Pallas (696783) | more than 9 years ago | (#12946889)

Analog computers still exist in some places, but you list discrete values. An analog computer works with an essentially continuous range of charges instead of discrete values; and it works continuously in time, instead of in discrete steps. They're very good at integrating, which is the application I used them in.

Other factors involved? (1)

Sv-Manowar (772313) | more than 9 years ago | (#12946732)

I don't think it's surprising to anyone that the mind works in an analogue fashion, weighing up the choices available to it as the decision is made, but I think this experiment is interesting in measuring the effect through physical reaction to verbal triggers. By using that many core subsystems of the brain, I think it's possible that effects could have been drawn into the experiment that are not wholly connected to the input/output streaming methods within the brain, and more to do with the physical operation of the mouse or the visual stimulus through the screen.

New Research Shows (1)

FLAGGR (800770) | more than 9 years ago | (#12946735)

.. that not only do we think in 0's and 1's, but we have 2's and 3's as well!

Misleading (4, Insightful)

rjh (40933) | more than 9 years ago | (#12946737)

The article's summation is far more accurate than Slashdot's. In TFA, a researcher says our minds don't work like digital computers.

The Slashdot headline says our minds don't work like computers, end of sentence.

Had TFSH (The Fine Slashdot Headline) been accurate, this would've been a mind-blowing result and in need of some extraordinarily strong evidence to support such an extraordinary claim. The question of whether the human mind--sentience, consciousness, and all that goes with it--is a computable process is one of the most wide-open questions in AI research right now. It's so wide-open that nobody wants to approach it directly; it's seen as too difficult a problem.

But no, that's not what these guys discovered at all. They just discovered the brain doesn't discretize data. Significant result. Impressive. I'd like to see significant evidence. But it's very, very wrong to summarize it as "our brains don't work like computers". That's not what they proved at all.

Just once, I'd like to see a Slashdot editor read an article critically, along with the submitter's blurb, before posting it.

Re:Misleading (1)

vanyel (28049) | more than 9 years ago | (#12946831)

What came to mind after I read the article was that their results looked like the behavior you'd expect from a standard tree search in a digital computer program, if you moved the mouse according to each branch decision...

0s and 1s (1)

Tweak232 (880912) | more than 9 years ago | (#12946738)

What do you mean, our brains do not work like computers? Granted, they are different, but the basic fundamentals are the same. We transmit and interpret data the same way computers do. We use electrical signals, and although the devices that send and interpret these signals are organic, they still only have an off and an on. There is no in between. Again, I would like to reiterate that this is only at the very basic level; if you are talking about higher-level thinking and operation, of course they are different. We can learn, can't we?

Colorblind... (0)

Anonymous Coward | more than 9 years ago | (#12946746)

Knew it before; I'm colorblind...

Tomorrow on the "Painfully Obvious" (5, Funny)

ugen (93902) | more than 9 years ago | (#12946748)

Birds do not fly like airplanes, they continuously wave their wings - and do not have turbines or propellers.

Sure hope my taxes don't pay for that "research".

I'd Just like to say... (1)

pjameson (880321) | more than 9 years ago | (#12946752)

No Shit

Who knew (0)

Anonymous Coward | more than 9 years ago | (#12946754)

We're analog, not digital

The brain is not a computer (5, Interesting)

Space cowboy (13680) | more than 9 years ago | (#12946769)


Does anyone *really* think that computers and the brain work in the same way ? Or even in a significantly similar fashion ?
Like them, we can do several things 'simultaneously' with our 'processors.'

Well, by 'processors', I assume you mean neurons. These are activated to perform a firing sequence on output connections dependent on their input connections and current state, heavily modified by chemistry, propagation time (it's an electrical flow through ion channels, not a copper wire), and (for lack of a better word) weights on the output connections. To compare the processing capacity of one of these to a CPU is ludicrous. On the other hand, the 'several' in the quote above is also ludicrous... "Several" does not generally correspond to circa 100 billion...

No-one has a clear idea of how the brain really processes and stores information. We have models (neural networks), and they're piss-poor ones at that...
  • There's evidence that the noise-level in the brain is critical - that less noise would make it work worse, and the same for more noise. That the brain uses superposition of signals in time (with constructive interference) as a messaging facility.
  • There's evidence that temporal behaviour is again critical, that the timing of pulses from neuron to neuron may be the information storage for short-term memory, and that the information is not 'stored' anywhere apart from in the pulse-train.
  • There's evidence that the transfer functions of neurons can radically change between a number of fixed states over short (millisecond) periods of time. And for other neurons, this doesn't happen. Not all neurons are equal or even close.
  • Neurons and their connections can enter resonant states, behaving more like waves than anything else - relatively long transmission lines can be set up between 2 neurons in the brain once, and then never again during the observation.

The brain behaves less like a computer and more like a chaotic system of nodes the more you look at it, and yet there is enormous and significant order within the chaos. The book by Kauffman ("The origins of order", I've recommended it before, although it's very mathematical) posits evolution pushing any organism towards the boundary of order and chaos as the best place to be for survival, and the brain itself is the best example of these ideas that I can think of.

Brain : computer is akin to Warp Drive : Internal combustion engine in that they both perform fundamentally the same job, but one is light years ahead of the other.

Simon.

The fools (1)

Dirtside (91468) | more than 9 years ago | (#12946776)

The W3C rejected my idea for a "sarcasm" HTML tag, when it would have been so useful at a time like this. Well, I can still fake it:

Our brains don't work like computers? <sarcasm>Noooo, you're kidding!</sarcasm>

At times like this? (1)

exp(pi*sqrt(163)) (613870) | more than 9 years ago | (#12946793)

You mean there are times when sarcasm isn't useful?

In other news ... (0)

Anonymous Coward | more than 9 years ago | (#12946780)

... humans communicate with none of the precision of robots, reproduce in a manner wholly dissimilar to an assembly line, and do not benefit from being hooked up to AC electric current.

Tune in tomorrow for more news from the next installment of, "YOU ARE NOT A COMPUTER, GEEK," exclusively on Slashdot.

Yes (0)

Anonymous Coward | more than 9 years ago | (#12946783)

(drone voice)
Yes. This is true. Logical. I concur.
Next up on the obvious channel, are fat people really heavier than skinny ones?

Indeed (1)

smchris (464899) | more than 9 years ago | (#12946795)


People aren't born with an innate foundation in predicate calculus?

I suppose it can be a useful line of research in robotic "muscle" coordination and world interaction.

Against the study (1)

sysbot (238421) | more than 9 years ago | (#12946796)

I think the study is somewhat flawed, because interfacing with the computer using a mouse, and basing the results of the study on that, leaves it open to other explanations. The main point, however, is this:

"When there was ambiguity, the participants briefly didn't know which picture was correct and so for several dozen milliseconds, they were in multiple states at once. They didn't move all the way to one picture and then correct their movement if they realized they were wrong, but instead they traveled through an intermediate gray area," explained Spivey

I think what he's trying to say here is that we pass through in-between states, which supports his continuity claim. But the way I see it, this is merely someone stopping to compare two similar words because they are so close, trying to differentiate the two, not because they are in a grey area. For example, if I see two words that are long and hard, and most of their characters are the same, then it will take me a while to figure out the difference and select the one with the intended meaning. So I think this study studied the wrong thing.

Both are computationally complete so WHO CARES? (2, Insightful)

John.P.Jones (601028) | more than 9 years ago | (#12946804)

I still believe in the Church-Turing Thesis... Our brains might not work LIKE computers but they don't do work DIFFERENTLY than them either.

Can you imagine... (0)

Anonymous Coward | more than 9 years ago | (#12946809)

A Beowulf made from these things?

umm (1)

ImTheDarkcyde (759406) | more than 9 years ago | (#12946826)

are they at least Turing Machines?

Re:umm (1)

cynic10508 (785816) | more than 9 years ago | (#12946882)

Brains: doubtful. Minds: definitely not.

Binary Not The Best (1)

Alphanos (596595) | more than 9 years ago | (#12946828)

"the researchers found that our learning process was similar to other biological organisms"

Why was this a surprising result? Prior to this, they thought what - that people were human-made binary computers in disguise? We have developed computer systems using binary math not because a binary system of logic is necessarily the best, but because binary components can be made easily and cheaply.

Also, figuring out a system of low-level operations such as NAND and XOR is more difficult for other number systems like decimal.

Yes, but... (0)

Anonymous Coward | more than 9 years ago | (#12946830)

Sorry, I have to know...

WILL IT RUN LINUX?

perhaps in a beowulf cluster?

Re:Yes, but... (1)

Rod Beauvex (832040) | more than 9 years ago | (#12946916)

More importantly, could a cluster of us run Linux?

01001000010101010100100000111111 (huh?) (1)

bosewicht (805330) | more than 9 years ago | (#12946839)

01010111010010000100000101010100001111110010000001001001001000000100010001001111010011100010011101010100001000000101010101001110010001000100010101010010010100110101010001000001010011100100010001001100100101001101010100010000010100111001000100 (What? I don't understand)

Of course not (0)

Anonymous Coward | more than 9 years ago | (#12946841)

A human brain can easily spot the problem in the following code:

int i = 1;
while (i > 0) {
    i++;
}

It is provably impossible for a computer to detect the problem there in any sort of general fashion.

Clearly there is something fundamentally different between the two devices.

From the no shit department (1)

Lord Haha (753617) | more than 9 years ago | (#12946850)

Wow, in other news we dont move the same way as cars!

Just wait this just in...

Not everyone agrees with Bush either!

- Comeon Slashdot you can do better then this

ridiculous! (0)

Anonymous Coward | more than 9 years ago | (#12946859)

This is such a ridiculous article. Not only is the hypothesis completely pointless (Brains behave like biological organisms), but the experiments they did have no bearing on this finding. Hasn't it been proven that phonemes are the lowest common denominator of language? And since when did the location of a mouse pointer accurately demonstrate the methods of the brain?

"it's a dynamical system" (1)

homeobocks (744469) | more than 9 years ago | (#12946861)

What's up with that? credibility--;

There was an article in SciAm mind (1)

Pinefresh (866806) | more than 9 years ago | (#12946874)

that talked about how, for the brain, every neuron firing is more like a line of code executing than a 1 or 0. I've visualised it like that since I read it, although I'm sure it's nowhere near that simple.

Confusing verbiage (1)

photon317 (208409) | more than 9 years ago | (#12946896)


They talk in the article of a "1's and 0's" concept of brain function, but they fail (at least through what is in the PR release about their experiment) to disprove that the brain operates on binary data.

Even computer software, which is known to operate on a strict binary system at the lowest layers, can have the appearance of linear, curving outputs as the data fed to it changes. This linearity breaks down at some granularity if you look closely enough at the output and see it jumping from one value to the next at some minimum discrete distance.

Unless they think that watching a student draw a curve on a screen with a mouse provides such a high-definition picture of the brain's decision making that they could see the granularity, the experiment is meaningless. Chances are high that even in a brain which operates on discrete binary information at the lowest level, the "output resolution" of something like mouse motor skills is capable of being considerably finer-grained than the resolution of the mouse being operated (or the muscles controlling it).