
A New Look At Brain Control

Soulskill posted more than 4 years ago | from the can-we-do-it-at-a-distance-yet dept.


one_neuron_two_neuron writes "Researchers at Harvard have taken a new look at how electricity can make neurons fire in the brain. The scientists found some surprising things: if you stick an electrode in the brain and apply current, you don't just make a small group of neurons fire; many neurons fire a long way from the electrode. That's probably because the current activates not the neurons' cell bodies but their axons, the wiring of the brain. Your cerebral cortex is something like a big pile of unwound yo-yos: if you stick an electrode into the cortex, you're much more likely to hit the strings (the axons), and the yo-yo connected to the string can be really far away. So, how will you ever hook up a computer to your brain? These data show that we need to rethink how to do that with electrical current. If you stick an electrode in one place, neurons in a totally different place will fire. New optogenetic methods (e.g. using viral delivery of light-sensitive proteins) might work. Or possibly we will figure out how to make the brain learn to interpret these sparse, widespread electrical patterns. New optical techniques have made a dramatic impact on neuroscience recently, and this study uses pulsed-laser scanning microscopy (two-photon microscopy) to take pictures of neurons deep inside the living brain. The academic paper (PDF) is available on the author's site."
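The "unwound yo-yo" picture can be sketched with a toy Monte Carlo model (purely illustrative, not from the paper; all distances and thresholds are made-up): scatter neurons with long, randomly oriented axons, and count a neuron as activated whenever any point along its axon passes within a small radius of the electrode tip. The activated somas end up scattered far from the electrode, because it was their axons, not their cell bodies, that happened to pass near the tip.

```python
import math
import random

random.seed(0)

ELECTRODE = (0.0, 0.0)   # electrode tip at the origin (units: micrometers)
REACH = 15.0             # hypothetical activation radius around the tip
AXON_LEN = 500.0         # axons are much longer than the activation radius

def simulate(n_neurons=2000):
    """Return distances from the electrode to the somas of activated neurons."""
    activated = []
    for _ in range(n_neurons):
        # Soma placed uniformly in a 1 mm square around the electrode.
        sx, sy = random.uniform(-500, 500), random.uniform(-500, 500)
        theta = random.uniform(0, 2 * math.pi)
        # Sample points along a straight axon leaving the soma.
        for t in [i * AXON_LEN / 50 for i in range(51)]:
            ax, ay = sx + t * math.cos(theta), sy + t * math.sin(theta)
            if math.hypot(ax - ELECTRODE[0], ay - ELECTRODE[1]) <= REACH:
                activated.append(math.hypot(sx, sy))  # distance of the SOMA
                break
    return activated

dists = simulate()
# A sparse subset of neurons is activated, and many of their somas lie
# hundreds of micrometers away from the 15 um activation radius.
print(len(dists), max(dists) if dists else 0)
```

Even this crude sketch reproduces the qualitative finding: the set of activated cell bodies is sparse and widespread, not a tight ball around the tip.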


65 comments




Computers? (0, Troll)

BrokenHalo (565198) | more than 4 years ago | (#29239607)

So, how will you ever hook up a computer to your brain?

Who wants to? I can't think of anything dumber.

Re:Computers? (3, Funny)

jakartus (1287248) | more than 4 years ago | (#29239625)

Without a computer hooked onto your brain, how else would you be able to relate to a female computer and have sex with it? Duh.

um what moron really wants one attached.... (1)

CHRONOSS2008 (1226498) | more than 4 years ago | (#29240245)

to his noggin. This is just stupid technology so they can make a robot and put us all on a welfare line.

Re:Computers? (1)

Sawopox (18730) | more than 4 years ago | (#29239627)

Most likely, it will be firewire. Also, hooking an electromechanical computer to a biochemical computer might turn into something fairly awesome.
With some type of ray gun or beam weapon and hardened carapaces.

Re:Computers? (1)

dangitman (862676) | more than 4 years ago | (#29239697)

Most likely, it will be firewire.

So, it burns?

Re:Computers? (5, Insightful)

EdZ (755139) | more than 4 years ago | (#29239665)

Who wants to? I can't think of anything dumber.

With the power of a computer attached to your mind, you would be able to think of many things that are dumber, several million times per second!

Re:Computers? (0)

Anonymous Coward | more than 4 years ago | (#29240107)

Hell I can't wait until I can simulate having a girlfriend real enough all in my head so I won't be lonely any more!

Re:Computers? (1)

LoRdTAW (99712) | more than 4 years ago | (#29240763)

I do that every day and I don't need no stinking computers. Sigh.

Re:Computers? (0)

Anonymous Coward | more than 4 years ago | (#29249343)

*sigh* kids these days... can't even masturbate without a computer telling them how!

Re:Computers? (0)

Anonymous Coward | more than 3 years ago | (#29266289)

You and 50 000 000 Japanese people.

Re:Computers? (5, Insightful)

KarrdeSW (996917) | more than 4 years ago | (#29239671)

So, how will you ever hook up a computer to your brain?

I thought the keyboard already solved this problem... It's not quite direct but it seems to work.

Re:Computers? (3, Insightful)

bmgoau (801508) | more than 4 years ago | (#29240093)

Not quite direct?

It's like trying to get water from one bucket to another, but instead of using a pipe, having to turn it into steam and use a tennis racquet to force it into its new bucket.

Think of all the unnecessary processing that goes into hand-eye coordination for data entry into a computer, or, in the opposite direction, all the visual processing the brain needs to do in order to recognise symbols on a screen, form words, and subsequently string those into coherent thoughts. It's utterly and completely inefficient. The only reason we find it acceptable is that, at the moment, it's the best system we have.

If we could either directly interpret thoughts accurately or feed information into the brain (a task of incredible complexity) we could drastically speed up all sorts of information processing tasks.

Re:Computers? (1)

dov_0 (1438253) | more than 4 years ago | (#29240241)

Mechanical data entry is probably much easier than a direct linkup, though. Unless you implant the connections in a child well before birth, before the brain wires itself exclusively for the mechanical way of doing things. How would you learn to use a direct linkup with a computer? It's not an intuitive concept at all. Easier to grow a child with it already in place, so they can learn to use it over a decade or so, just like we learn to use our hands so well.

Re:Computers? (1)

Ifandbut (1328775) | more than 4 years ago | (#29240715)

You learn by doing, much the same way you learn to type. You start off chicken pecking, and with some work you type faster and faster. You type faster because your brain adapts and no longer needs visual input to know where the keys are, and your finger muscles get stronger so you can move your hands faster.

Another example: games. I used to be able to play first-person shooters only on the PC. I was one of those people who said "you can't get better than a mouse and keyboard" and "there is no way you can play an FPS on a console." Then Halo 2 came out while I was in college, and everyone had to play it. I didn't want to at first because I thought I would suck. And I did suck: I could barely look and move at the same time. However, after playing it for a month I got used to moving and aiming with a controller. I started liking the fact that I could go from a slow walk to a full run just by tilting the stick more. Today I can pick up a console shooter and play it as naturally as I can a shooter on the PC.

I don't think a brain interface will be much different. At first you will really have to focus on each letter or word to get it to display, but after a while it will get easier and easier for words to just appear on screen, and before you know it you are making fully formed paragraphs with no more effort than simply thinking about what you want to say.

Re:Computers? (1)

tedr2 (1502807) | more than 4 years ago | (#29241075)

either directly interpret thoughts accurately or feed information into the brain

That is a call for I/O engineers, right?

Re:Computers? (1)

radtea (464814) | more than 4 years ago | (#29244319)

If we could either directly interpret thoughts accurately or feed information into the brain (a task of incredible complexity) we could drastically speed up all sorts of information processing tasks.

Why do you believe that?

You insist that we "think of all the unnecessary processing" that goes on, but give us no reason to believe that it is unnecessary other than, presumably, the fantasy that it can be dispensed with. Why do you believe it can be dispensed with? The most likely case, it seems to me, is that it can be externalized, so that a computing device will interface with our neural networks and neuro-chemical systems somewhat further upstream than it does now.

In particular, I have no idea at all what you mean by "directly interpret thoughts". What is a "thought" in this context? There is a huge amount of stuff going on in the brain at any given time, and no reason to believe that any of it can be mapped onto an isolated entity that could reasonably be labelled a "thought" by any means other than looking at the operational outputs of our conventional interfaces: our words (spoken, written, or internally recited) and actions.

You are bringing a huge and utterly unjustified load of assumptions to your argument, and if someone doesn't buy into your peculiar model of the brain, which I don't, then what you're arguing for is just silly, and your conclusion that somehow externalizing the transformation from internal neuro-chemical and neuro-electrical behaviour to operational effects could "drastically speed up all sorts of information processing tasks" is entirely unjustified.

The brain is primarily a neuro-chemical machine; the neuro-electrical aspects that computer people tend to focus on are a very thin layer on top of the underlying neuro-chemistry. And that chemistry has time constants that are adequate for information processing using the operational inputs and outputs we have: there is no reason to believe that bypassing those systems would result in any speedup at all. Indeed, it would be evolutionarily bizarre if the brain were capable of much more than its current speed, because there cannot ever have been any evolutionary pressure of any kind to select for that.

Your argument is like saying that if we put a couple of jet engines on a steam locomotive it would be able to break the speed of sound, ignoring things like the mechanical limitations of the wheels and the tracks: you're focused on one obvious limiting factor in the body/brain identity's performance, and supposing that because you can imagine the rest of the system running at much faster speeds, it actually can.

Re:Computers? (2, Funny)

dkleinsc (563838) | more than 4 years ago | (#29241701)

A keyboard. How quaint. -cracks knuckles-

Re:Computers? (0)

Anonymous Coward | more than 4 years ago | (#29249781)

As someone who seems to be prone to tendinitis, I respectfully disagree.

Re:Computers? (2, Insightful)

Zen Hash (1619759) | more than 4 years ago | (#29239693)

So, how will you ever hook up a computer to your brain?

Who wants to? I can't think of anything dumber.

People have always said that about new technology. Eventually someone comes up with a killer application and it takes off.

Re:Computers? (0)

Anonymous Coward | more than 4 years ago | (#29239825)

Name one common tech that directly interfaces with the body.

Re:Computers? (1, Informative)

Anonymous Coward | more than 4 years ago | (#29239879)

the mouse.

dildo.

anal beads.

nasal spray containers.

gerbil.

Re:Computers? (0)

Anonymous Coward | more than 4 years ago | (#29241917)

gerbil.

nice try, but before you keep propagating that myth, you might want to see what Cecil has to say about Richard Gere and the gerbil [snopes.com].

Re:Computers? (1)

eyepeepackets (33477) | more than 4 years ago | (#29239903)

Toilet paper is common tech and the application is most certainly direct.

Re:Computers? (2, Informative)

ozydingo (922211) | more than 4 years ago | (#29239949)

If you want to get as specific as neural stimulation, then cochlear implants, depending on how common you mean. But I'm glad to be in a world that has that technology. It's also highly analogous and can benefit from the same research, as it involves direct electrical stimulation of a population of neurons. One of the limiting factors in cochlear implants' success is that it is currently (no pun intended) impossible to stimulate anything close to a narrow, precise range of auditory neurons; thus frequency resolution in CI users suffers dramatically. We have something like 30,000 auditory neurons along an almost continuous range of frequencies; studies tend to show that with a CI you get a maximum of about 8 separate frequency channels before performance saturates.
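The resolution loss can be sketched in a few lines (an illustrative toy, not any real CI processing strategy; the channel count, frequency range, and log spacing are all assumptions): collapse the continuum of audio frequencies onto 8 log-spaced channel centers, and nearby pitches become indistinguishable.

```python
import math

# Hypothetical 8-channel map over part of the hearing range (values illustrative).
N_CHANNELS = 8
LO_HZ, HI_HZ = 200.0, 8000.0

# Log-spaced channel center frequencies, mimicking the cochlea's
# roughly logarithmic frequency-to-place mapping.
centers = [LO_HZ * (HI_HZ / LO_HZ) ** (i / (N_CHANNELS - 1))
           for i in range(N_CHANNELS)]

def channel_for(freq_hz):
    """Assign a frequency to the nearest channel center on a log scale."""
    return min(range(N_CHANNELS),
               key=lambda i: abs(math.log(freq_hz) - math.log(centers[i])))

# Two tones roughly a whole step apart collapse into the same channel,
# while widely separated tones still land in different channels.
print(channel_for(1000.0), channel_for(1122.0), channel_for(250.0), channel_for(4000.0))
```

With ~30,000 neurons replaced by 8 bins, most of the fine pitch structure is simply gone, which is one intuition for why CI performance saturates so quickly.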

Re:Computers? (2, Interesting)

agnosticnixie (1481609) | more than 4 years ago | (#29241207)

It also requires extensive (and dangerous) surgery, has a high rate of failure due to poor education techniques, and the devices themselves fail often. They're also extremely rare, tend to cause rashes every once in a while, and are uncomfortable to keep on.

Re:Computers? (1)

Ultra64 (318705) | more than 4 years ago | (#29243105)

So true. It's too bad that once something is invented it's impossible for it to ever be improved.

Re:Computers? (1)

agnosticnixie (1481609) | more than 4 years ago | (#29243215)

The technology has improved, but it still amounts to slapping a computer in your cochlea: the discomfort has not gone away, the failure rates have not gone away, and the surgery is still done in a ridiculously sensitive part of the body.

Re:Computers? (1)

TheLink (130905) | more than 4 years ago | (#29242145)

If you start improving cochlear implants significantly (better fidelity etc) and attaching computers to them, I'm sure the RIAA and gang will have a word with you, and "convince" you to include DRM.

So will it be a penny for your ("their") thoughts? I'm thinking they might charge more.

Re:Computers? (1)

FooAtWFU (699187) | more than 4 years ago | (#29239861)

Or it doesn't, and it fades into obscurity like countless non-notable technologies.

Re:Computers? (1)

Zen Hash (1619759) | more than 4 years ago | (#29239971)

Or it doesn't, and it fades into obscurity like countless non-notable technologies.

Applications of the technology may fail to catch on and fade into obscurity, but that doesn't prevent someone else from rediscovering the technology and applying it in a new way later on.

Re:Computers? (3, Funny)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#29239937)

Though, in this case, the killer application is going to be heavier on the "transhuman cyborg killbot" and light on the "visicalc"...

Worst. Analogy. Ever. (1, Insightful)

dangitman (862676) | more than 4 years ago | (#29239691)

Your cerebral cortex is something like a big pile of unwound yo-yos

WTF? Why don't they just say "the brain is a big pile of neurons and axons"? It would be more helpful than this bizarre analogy.

Re:Worst. Analogy. Ever. (4, Funny)

ZackSchil (560462) | more than 4 years ago | (#29239749)

It's not a big pile of neurons and axons! It's a series of yo-yos!

Re:Worst. Analogy. Ever. (1)

lennier (44736) | more than 4 years ago | (#29255185)

A series of yo-yos linked together by Twitter.

Old News (0)

Anonymous Coward | more than 4 years ago | (#29239703)

A new look at brain control... in other words, what the government's been doing for the last 30+ years. Ever heard of the little CIA project called MK-ULTRA?

Typing with two fingers (5, Funny)

ZackSchil (560462) | more than 4 years ago | (#29239731)

It makes me sad to think that some day, I'll be stumbling around with my brain machine interface, creating thought streams one at a time and accidentally thinking of something else halfway through, losing my work, and asking the kids for help.

And they'll look me in the eye and feign sympathy as they blast high-frequency shorthand thoughts back and forth to one another, mocking my generation for being so dumb that its members can't even work a brain port properly.

Re:Typing with two fingers (3, Funny)

drseuk (824707) | more than 4 years ago | (#29239955)

Remember to think in Soviet Russian if you're in Firefox.

Re:Typing with two fingers (1)

BobisOnlyBob (1438553) | more than 4 years ago | (#29239997)

I'm still typing with two fingers, having never learned to touch-type and generally having enough muscle memory to work with what I've got. Sure, my WPM is dreadful and I'll never be a typist, but for programming at the scale I do, it's less about vast pages of code and more about knowing what I'm writing.

I'm also a strong advocate of Brain-Computer Interface technology, although whether it's out of reluctance to re-learn typing or a desire to push the boundaries of the human condition, well... ...a little from column A, a little from column B...

Re:Typing with two fingers (0)

Anonymous Coward | more than 4 years ago | (#29240341)

It makes me sad to think that some day, I'll be stumbling around with my brain machine interface, creating thought streams one at a time and accidentally thinking of something else halfway through, losing my work, and asking the kids for help.

Don't worry, you're not that far from Twitter ;)

Magnetic Fields? (0)

Anonymous Coward | more than 4 years ago | (#29239819)

What if we just give people a magnet to wave around their heads...would that work, too? You know, with the induced current and all...

Posting from Nickelback. (-1, Offtopic)

symbolset (646467) | more than 4 years ago | (#29239959)

I have never heard acoustic piano amplified this loud. The crowd is eating it up.

You people disgust me.

Take It Back To Dr. Reid (1)

DynaSoar (714234) | more than 4 years ago | (#29240061)

From the sophomoric language in the summary (my apologies to real sophomores, like the ones in high school) and the fact that the paper hasn't been submitted anywhere, so who else would have access, I conclude that it was the first author (a student), not the second (a postdoc who did the MATLAB work) or the third (an MD/PhD), who posted this here. Calling yourself a researcher without making your student status clear is a failure of full disclosure. It also works against you, in that people will forgive a student a lot of mistakes that they'll otherwise end up holding against you.

In your hurry to promote your ideas you gloss over or ignore a large body of research that disproves your assertions.
- We've known for a long time about sparse networks and distal activation. The distribution is due to Hebbian assemblies. Hebb, Donald O.
- We can and do stimulate and record nearby spatially and temporally. Use the stimulation pulse as the deactivation signal on the recording probe. We can even do them simultaneously with high impedance, high speed data collection equipment and adjusting the output Y scale to logarithmic. Ask Vince to explain this last part. We also use magnetic stimulation and shielded electrical probes.
- Inadequate and poorly stated background material is a sure way not only to get rejected but also to be remembered by those editors and rejected in the future. You've got to assume that not only will some of them know the field, but some of those reading your submission may be among those being misrepresented, or among those you failed to represent but should have.
- The above goes double for trying to make your idea sound good by making others sound bad. It pisses people off, and you may be wrong (you are), which just gets your manuscript returned.
- You fail to show that your method can be generalized to normal neuron activation, and thus be useful for (the inadequately described) computer interface, or anything else that requires 'normal' operation. And not being able to show normal operation means not being able to tell if and/or when your method causes incorrect operation or damage.
- The background theory and history shortcomings call into question your theory, the need for your technique, and the usefulness of your large volume of supporting data and graphics. This last part looks like argumentation by bafflement - flooding the reader with overly technical details in order to bullshit them into accepting your other points. Even if it's not that, it looks like it.

Does Dr. Reid know this was posted here?

Re:Take It Back To Dr. Reid (1, Informative)

Anonymous Coward | more than 4 years ago | (#29240169)

Does Dr. Reid know this was posted here?

Do you know that this was published in Neuron?

Neuron. 2009 Aug 27;63(4):508-22.

Re:Take It Back To Dr. Reid (0)

Anonymous Coward | more than 4 years ago | (#29240337)

Are we sure this summary wasn't just written by someone with active electrodes shoved into their head?

Re:Take It Back To Dr. Reid (0)

Anonymous Coward | more than 4 years ago | (#29245717)

And who are you?

Re:Take It Back To Dr. Reid (0)

Anonymous Coward | more than 4 years ago | (#29246369)

I suspect the postdoc :p

Re:Take It Back To Dr. Reid (0)

Anonymous Coward | more than 4 years ago | (#29275685)

OK. Deep breath, DynaSoar.

The first author did this work as a postdoc in the Reid lab, not as a student, and the paper has already been published in a well-regarded peer-reviewed journal. The language in the Slashdot summary appears to come directly from the linked article. So there goes your first paragraph, including any evidence for your conjecture that it was the first author, and not some science blogger, who wrote the summary. (Full disclosure: I am an acquaintance of the first author, but I don't know if he submitted the story.)

Mixed in with your grousing about the authors' imputed attitude, you make three scientific points: 1) the results are not novel because we already knew about Hebbian cell assemblies; 2) overcoming stimulation artifacts is not important because there are known workarounds; 3) the 'method cannot be generalized', and is therefore inappropriate for brain-computer interfaces.

Let's look at these one by one.

1) The claim made in the paper is explicitly *not* that stimulation drives activity across Hebbian cell assemblies (i.e., via synaptic propagation of activity through the network). Rather, the authors claim that electrical stimulation *directly* activates a sparse subset of cells in quite a large area around the electrode tip. Their optical methods don't have the temporal resolution to measure latencies to activation (which would be the gold standard), but the claim is supported by their finding that these patterns of activation persist even when excitatory transmission is blocked.

2) While there are known workarounds for stimulation artifacts, optical imaging doesn't have this problem to begin with. Furthermore, this is only one of the advantages of using optical imaging to measure the effects of stimulation. No other method can simultaneously and comprehensively monitor the activity of hundreds of cells near the stimulation site.

3) I'm not sure what method you are talking about. The authors argue that their work helps us better understand the effects of electrical stimulation in the brain, and that this improved understanding will be helpful in the design of new electrical stimulation techniques. They do not claim that 'their method' will be useful for interfacing brains and computers. In fact, in the last couple of paragraphs of their discussion, where authors are usually at their most speculative, they argue for caution, stating for instance that high-resolution electrical stimulation of visual cortex might not be possible. Hardly a snow job, as you imply.

You come across as haughty, bitter, and unwilling to carefully read new work that may challenge your assumptions. I suggest that rather than looking down your nose at 'students', you might do well to recall a little of the humility and openness to new ways of addressing a problem that you presumably felt when you were a student.

instant education (0)

Anonymous Coward | more than 4 years ago | (#29240419)

i just want to be able to download stuff to my brain, kinda like in the matrix, maybe a bluetooth chip embedded in my head

Wave interference? (0)

Anonymous Coward | more than 4 years ago | (#29240423)

If you look at the water's surface during rain, you will see waves crossing all the time.

Now imagine that in the brain you get a response exactly where several waves cross and merge into one temporal peak. Thus, it is important to have a frequency that can be modulated... Now imagine that you have billions of neurons (drops of water) and trillions of places where waves cross to form a signal strong enough to induce another wave.

I am not a brain researcher, but could you reproduce this chaos using light interference to create an artificial brain?

How the brain works (0)

Anonymous Coward | more than 4 years ago | (#29240469)

As a former neuroscientist I can attest to the difficulty of stimulating one neuron specifically. One needs to realize that the concept of stimulation by an electrode is completely meaningless without a proper understanding of what brain activity normally looks like. It doesn't look like one 'sending' neuron in an otherwise tranquil environment. It looks like an ant hill of signals, with groups of 'sending' and groups of 'listening' neurons changing roles all the time. So even once you are over the surprise that you may be poking an axon instead of a neuron soma, there is still a long way to go before you understand what you are poking into.

Why not use nerves? (0)

Anonymous Coward | more than 4 years ago | (#29240479)

If you're trying to connect a computer to the brain, why not connect it through the nerves? Sounds way more pleasant than brain electrodes.

Who knew? (1)

RichardJenkins (1362463) | more than 4 years ago | (#29240889)

Your cerebral cortex is something like a big pile of unwound yo-yos

Yeah, great. Thanks for clearing that one up Slashdot.

Re:Who knew? (0)

Anonymous Coward | more than 4 years ago | (#29241241)

Well, it's more accurately described with a car analogy, but the yo-yo one gets the idea across...


Not totally impressed with the summary (1)

monoqlith (610041) | more than 4 years ago | (#29242249)

Particularly the language "totally different place."

Neurons in a "totally different place" will fire... this kind of language doesn't comfort me about the prospect of sticking electrodes into my brain. "We're not totally sure which neurons we're firing. It could be the ones we're touching with this little wire, or it could be ones in a totally different place. I guess we'll find out, huh?!"

If my neurosurgeon said that to me as he started drilling into my skull, I would want to strongarm my way out of the OR...

If only I could get my head out of this damn Clockwork Orange device...

I have a new non-respect for Harvard (1)

joeyblades (785896) | more than 4 years ago | (#29243499)

This is so patently obvious that to call it a new discovery is just silly. Who conducted this study at Harvard, the accounting department?

Anyone involved in brain science knows how interconnected the brain is and how remotely connected neurons are. Why would anyone assume that forcing one neuron to fire from an electrical stimulus would only affect neurons locally?

Re:I have a new non-respect for Harvard (1)

neurocutie (677249) | more than 4 years ago | (#29248545)

1) Harvard has nothing to do with this...
2) It is a "discovery", or at least a new result and a new finding, because the method used shows EXACTLY what happens with electrical stimulation, not just guesses.
3) Previously in the field, the guess (assumption) was that there would ROUGHLY be a sphere of activated neurons surrounding the electrode tip. The results show that this assumption is very wrong: the actual neurons activated are highly idiosyncratic and difficult to predict. MANY previous studies that have been VERY influential in the field were based on this loose assumption, now shown to be incorrect.
4) Yeah, in hindsight one could always say "no surprise": given the wide variety of neurons, their properties, excitation states, channel complements, etc., it is no surprise that the activated neurons would be "all over the place". But at least there now is a way to know EXACTLY what is being stimulated (or not stimulated)...
5) These kinds of experiments are quite difficult to perform, requiring at least half a million dollars of equipment or more, and years of training and expertise. Indeed, they have NEVER been done before. So they should be viewed with some appreciation and respect...

Re:I have a new non-respect for Harvard (1)

joeyblades (785896) | more than 4 years ago | (#29249257)

You say

Previously in the field, the "guesses" (assumption) was that there would ROUGHLY be a sphere of activated neurons surrounding the electrode tip.

I know the article stated this, but this is clearly not the case. As early as 1874, Roberts Bartholow demonstrated that electrical stimulation of neurons triggered muscular contractions. These kinds of experiments have been going on with rats, cats, dogs, monkeys, and a veritable menagerie of other critters (including humans) for more than 100 years. It has been well understood through experimentation that electrical stimulation of one area of the brain triggers the firing of other neurons elsewhere in the brain. That whole business about "balls" of affected neurons is so patently wrong, I can only assume they were trying to make some other point...

The only interesting, newsworthy part of this is the ability to observe this behavior at a cellular level using new imaging tools.

I would have been a lot less dismissive if the headline had been something along the lines of:

New optical imaging technology used to show unprecedented detail of neuron behavior

Re:I have a new non-respect for Harvard (1)

neurocutie (677249) | more than 4 years ago | (#29249385)

As early as 1874, Roberts Bartholow demonstrated that electrical stimulation of neurons triggered muscular contractions.

You're confusing the activation of neurons local to a stimulating electrode vs the propagation of a signal (e.g. action potential) via the axons (nerves) from such neurons to their targets (which may be other neurons or muscles, near or far away).

It has been well understood through experimentation that electrical stimulation of one area of the brain triggers the firing of other neurons elsewhere in the brain.

Again you're missing the point. The article deals with the INITIAL patterns of activation at the stimulating site and not the subsequent patterns of activation that are the projection targets of the neurons and fibers first activated.

Re:I have a new non-respect for Harvard (1)

joeyblades (785896) | more than 4 years ago | (#29251105)

You're right. My bad.

No shit Sherlock? (1)

Hurricane78 (562437) | more than 4 years ago | (#29243675)

you don't just make a small group of neurons fire -- many neurons fire a long way away from the electrode. That's probably because instead of activating the cell bodies of the neurons, their axons fire. [...] and the yo-yo connected to the string can be really far away.

Was this ever not blatantly obvious?
I'm always shocked by how unobvious the whole concept of the brain is to even the very "scientists" who work on it.
They seem to always be caught in some tiny box and/or focusing on details that are only artifacts of the foundational rules.
I'm sorry, but trying to identify what a specific part of the brain does (except for some really specific parts like the cerebellum) is a pointless exercise that shows you don't really understand how a neural network works.
It is like looking at the digital data inside a memory chip, instead of at the wires themselves, to understand how the chip works. You can gain something from it. But why use such a bad point of view?

And this is even worse: I mean, who here seriously did not think that when you stick an electrode in your brain, the electricity will run down the axons it touches, to land anywhere that axon ends (which can sometimes be more than 2 or 3 feet away, somewhere up or down the spine!)
What did they think before? That it only touches the cells themselves? That's like only touching the fruits inside a rolled-up raspberry bush.

Oh, and by the way: Yo-yos? Really? That's the worst analogy ever. Or did you do that to foster "Yo dawg, [...] yo momma and yo yo ma's yo-yo [...]" jokes?
