
New Imaging Technique Helps Explain Unconsciousness

Soulskill posted more than 2 years ago | from the beware-the-unexpected-nap-attack dept.

Medicine

smitty777 writes "A new imaging technique called fEITER (for functional Electrical Impedance Tomography by Evoked Response) attempts to explain the process of slipping into unconsciousness. The fEITER is a portable device that creates 3D imagery based on evoked potentials measured hundreds of times a second. The interesting finding from these studies is that unconsciousness appears to result from a buildup of inhibitory neural assemblies. From the article: 'Our findings suggest that unconsciousness may be the increase of inhibitory assemblies across the brain's cortex. These findings lend support to Greenfield's hypothesis of neural assemblies forming consciousness.'"


78 comments


No comments? (0)

Anonymous Coward | more than 2 years ago | (#36486624)

Seems like Slashdot is unconscious today.

Re:No comments? (1)

Geldon (444090) | more than 2 years ago | (#36486640)

Must be a build-up of inhibitory assemblies.

Re:No comments? (1)

bughunter (10093) | more than 2 years ago | (#36486662)

Or maybe just a buildup of sunshine and nice weather outside.

Re:No comments? (1)

Anonymous Coward | more than 2 years ago | (#36486670)

"Outside"?

Re:No comments? (1)

Anonymous Coward | more than 2 years ago | (#36486684)

That big room with the leaky roof and the big light.

Re:No comments? (0)

Anonymous Coward | more than 2 years ago | (#36486698)

Oh that? Only habitable for a few hours at a time - there's nowhere to plug in an AC charger!

Re:No comments? (1)

PopeRatzo (965947) | more than 2 years ago | (#36487088)

Oh that? Only habitable for a few hours at a time - there's nowhere to plug in an AC charger!

You don't have a portable solar collector and transformer? Or at least a Pasco hand crank generator? Are you sure you belong here at /.?

Re:No comments? (0)

Anonymous Coward | more than 2 years ago | (#36487274)

conscious or unconscious, is there really a difference? it's all one and the same, isn't it?

Re:No comments? (1)

WillKemp (1338605) | more than 2 years ago | (#36487488)

conscious or unconscious, is there really a difference? it's all one and the same, isn't it?

For some, yes.

Re:No comments? (1)

Abstrackt (609015) | more than 3 years ago | (#36488226)

I see you're from the 60s too!

not really new (0)

Anonymous Coward | more than 2 years ago | (#36486692)

The spatial resolution of any scalp-based recording device is insufficient to say anything very informative about consciousness or assemblies, which were proposed by Donald Hebb in the 1940s to be the basis of information processing in the brain. We know much more about this process from animal models, which allow for recording the activity of neurons during sleep/wake/problem solving.

would've been interesting enough to quote (0)

Anonymous Coward | more than 2 years ago | (#36486696)

“We have been able to see a real time loss of consciousness in anatomically distinct regions of the brain for the first time. We are currently working on trying to interpret the changes that we have observed, as we still do not know exactly what happens within the brain as unconsciousness occurs, but this is another step in the direction of understanding the brain and its functions.”

The device is as newsworthy as the results (5, Insightful)

bughunter (10093) | more than 2 years ago | (#36486700)

FTFA:

The machine itself is a portable, light-weight monitor, which can fit on a small trolley. It has 32 electrodes that are fitted around the patient’s head. A small, high-frequency electric current (too small to be felt or have any effect) is passed between two of the electrodes, and the voltages between other pairs of electrodes are measured in a process that takes less than one-thousandth of a second.
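The scan pattern described in the quote (inject a small current on one electrode pair, read voltages between the remaining pairs) can be sketched roughly as follows. This is a generic electrical impedance tomography scan loop, not the actual fEITER protocol; the adjacent-pair injection and sensing scheme is an assumption for illustration.

```python
# Illustrative EIT-style scan: cycle through adjacent injection pairs on a
# 32-electrode ring and record differential voltages between the remaining
# adjacent pairs. measure_voltage() stands in for the hardware read.

N_ELECTRODES = 32

def measure_voltage(inject_a, inject_b, sense_a, sense_b):
    """Placeholder for the hardware measurement; returns a dummy value here."""
    return 0.0

def one_frame():
    frame = []
    for i in range(N_ELECTRODES):
        a, b = i, (i + 1) % N_ELECTRODES          # adjacent injection pair
        for j in range(N_ELECTRODES):
            c, d = j, (j + 1) % N_ELECTRODES      # adjacent sense pair
            if len({a, b, c, d}) == 4:            # skip pairs sharing an electrode
                frame.append(measure_voltage(a, b, c, d))
    return frame

frame = one_frame()
print(len(frame))  # measurements collected per frame
```

With 32 electrodes and adjacent pairs, each injection pair excludes the three sense pairs that share an electrode with it, giving 32 × 29 readings per frame; at sub-millisecond per measurement, hundreds of frames per second is plausible.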

While we're still a long way away from a practical direct neural interface, this certainly looks like a step in the right direction. They've demonstrated that the measurements are possible, and at a sample rate that is useful. Certainly there's room for improvement in sensitivity, sample rate, and resolution as well as in miniaturization.

When they can reduce this from a trolley-cart-sized instrument to something one can support on one's head, then we'll see some more practical and less academic applications. (Yes, like porn. And games. And real virtual reality control of UAVs and waldoes.) Keep in mind that in the '80s, realtime Heads-Up Displays were this large and cumbersome... now look at them.

It really is illuminating to see how little we know about the nature of consciousness and thought, and how far away we still are from technologically-aided introspection.

Re:The device is as newsworthy as the results (4, Informative)

yarnosh (2055818) | more than 2 years ago | (#36486746)

When they can reduce this from a trolley-cart-sized instrument to something one can support on one's head, then we'll see some more practical and less academic applications. (Yes, like porn. And games. And real virtual reality control of UAVs and waldoes.) Keep in mind that in the '80s, realtime Heads-Up Displays were this large and cumbersome... now look at them.

Are you, perhaps, confusing reading neural activity with sending specific information into the brain? As far as I can tell, the technology in the article is only for reading neuron activity, not altering it. And even at that, there's no indication that you can extract any real information out of the readings (thoughts, intentions, etc). It is simply an image of activity. I think you're reading WAYYY more into this technology than is there.

Re:The device is as newsworthy as the results (1)

bughunter (10093) | more than 2 years ago | (#36486776)

No, not confusing -- just extrapolating.

A lot, admittedly... like I said, it's a step in the right direction. A baby step.

Re:The device is as newsworthy as the results (1)

ColdWetDog (752185) | more than 2 years ago | (#36486814)

It appears to be an interesting experimental system because you can correlate structural changes with functional ones non-invasively and seemingly relatively inexpensively (from the description in TFA the electronics seemed pretty straightforward). As with the 'other' popular way of looking at brain structure / function (fMRI [wikipedia.org]), we have a long way to go before it is terribly 'useful', but any decent non-invasive technology for understanding brain function is a good thing.

Mwahahahaha....

Re:The device is as newsworthy as the results (0)

Anonymous Coward | more than 3 years ago | (#36488158)

You think this is a step? You should take a look at memristors, graphene, and artificially constructed neural networks! Of course, structural microbiological scans by nanoautonomous swarm robotics are sure to follow.

Re:The device is as newsworthy as the results (0)

Anonymous Coward | more than 3 years ago | (#36490156)

A baby step.

A quantum leap.

Re:The device is as newsworthy as the results (1)

Omestes (471991) | more than 3 years ago | (#36491302)

A quantum leap.

So even smaller?

Re:The device is as newsworthy as the results (2)

Dachannien (617929) | more than 2 years ago | (#36487092)

And even at that, there's no indication that you can extract any real information out of the readings (thoughts, intentions, etc).

Classifier technology is already advanced enough that making this jump shouldn't be too difficult. The real limitations are (a) the amount of data to be processed and (b) the resolution of the sensors.
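The kind of classifier the parent has in mind can be illustrated with a toy nearest-centroid model over feature vectors. The features, labels, and numbers below are invented for the sketch; a real system would use actual sensor data and a far richer model.

```python
# Toy nearest-centroid classifier: label an unseen "reading" by whichever
# class centroid it is closest to in squared Euclidean distance.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(x, centroids):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # min over the dict iterates labels; pick the closest centroid's label
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# invented training data: two clusters of 3-feature "readings"
awake  = [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1], [1.0, 0.0, 0.3]]
asleep = [[0.1, 0.9, 0.8], [0.2, 0.8, 0.9], [0.0, 1.0, 0.7]]
centroids = {"awake": centroid(awake), "asleep": centroid(asleep)}

print(classify([0.85, 0.15, 0.2], centroids))  # -> awake
```

The hard part, as the parent notes, is not this step but getting feature vectors with enough information in them in the first place.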

Re:The device is as newsworthy as the results (1)

ceoyoyo (59147) | more than 2 years ago | (#36487406)

the resolution of the sensors.

Classification isn't the problem, and never really has been. It's VERY unlikely that you can get anything like the resolution you'd need for any sort of useful brain reading from surface sensors. Even if you're willing to undergo invasive surgery to implant electrodes, you'd almost certainly need more than are practical to implant at the moment; placement is also a problem, and you STILL might not get enough spatial resolution.

For SF style brain interfaces we're still lacking a workable means to read and stimulate neurons.

Re:The device is as newsworthy as the results (1)

yarnosh (2055818) | more than 3 years ago | (#36488972)

The real limitations are (a) the amount of data to be processed and (b) the resolution of the sensors.

Oh gee, is that all? That's a bit like saying the only thing stopping us from simulating a human brain inside a computer is processing power. Never mind the enormous lack of basic understanding about how a thought is actually processed. Just throw a bunch of processing power at it and everything will just magically work itself out. /rollseyes

Re:The device is as newsworthy as the results (0)

Anonymous Coward | more than 3 years ago | (#36489056)

Are you, perhaps, confusing reading neural activity with sending specific information into the brain? As far as I can tell, the technology in the article is only for reading neuron activity, not altering it.

That's what they want you to think!

*dons tinfoil hat*

Re:The device is as newsworthy as the results (0)

Anonymous Coward | more than 2 years ago | (#36486930)

When they can reduce this from a trolley-cart-sized instrument to something one can support on one's head, then we'll see some more practical and less academic applications.

I tip my tinfoil hat, Sir!

Re:The device is as newsworthy as the results (1)

airfoobar (1853132) | more than 2 years ago | (#36487040)

like porn. And games. And real virtual reality control of UAVs and waldoes

And the TSA.

Prior art (1)

srussia (884021) | more than 2 years ago | (#36486842)

"Method to increase inhibitory assemblies across the brain's cortex using an imaging technique"

aka 'C-SPAN'

Defensive Implant (1)

Anonymous Coward | more than 2 years ago | (#36486846)

In the first Knights of the Old Republic game, there was an implant item whose flavor text read that it kept over-stimulation from overloading sensory brain parts, and causing damage or unconsciousness. This would be great for soldiers getting hit with IEDs, or whatnot.

Re:Defensive Implant (2)

TheLink (130905) | more than 3 years ago | (#36492212)

Oh yeah, I'm sure the main problem with an IED is overloading sensory brain parts, and that soldiers would be quite happy to not be unconscious after getting hit with an IED and losing a limb or three. They'd rather be able to soberly savour the reality of their situation and the consequences. So most would gladly give an arm and leg for such an implant.

Bose Einstein Condensate? (1)

Pino Grigio (2232472) | more than 2 years ago | (#36486862)

I would be interested to know what constitutes a "neural assembly". I suspect that some form of coherence is involved. The question is whether or not this would be quantum coherence. This would be very difficult to establish, of course, just as it was to establish quantum coherence in photosynthesis.

Re:Bose Einstein Condensate? (1)

smitty777 (1612557) | more than 2 years ago | (#36487130)

I think you're reading a little too much into this - they are basically talking about groups or networks of neurons. Not sure how quantum coherence [wikipedia.org] would play a role in this.

Re:Bose Einstein Condensate? (0)

Anonymous Coward | more than 3 years ago | (#36490860)

Don't you know, since quantum mechanics is mysterious and unintuitive, it can be used to explain any mental phenomenon we don't understand?

Just ask Deepshit Chopra.

Re:Bose Einstein Condensate? (1)

FooAtWFU (699187) | more than 3 years ago | (#36491548)

At a guess, he's probably trying to integrate the whole "quantum" thing into consciousness because he either thinks it's what permits free will (or else because he thinks it has some other magical properties resulting from mankind's mystical unity with the universe - but I'll assume he's not that dumb). The thing is, as explained by The Hammock Physicist (who runs a decent blog), "quantum free will" theories are lame and don't actually get you the sufficient conditions for very much free will. [science20.com] Fortunately, he goes on to explain, you don't need them.

Re:Bose Einstein Condensate? (1)

Pino Grigio (2232472) | more than 3 years ago | (#36495964)

It's not really anything to do with free-will. It's more to do with the fact that I don't believe purely functional explanations of conscious experience, particularly the unitary aspect of it, can be adequately described with current theories. Quantum effects offer a route towards an understanding. Even with quantum effects there is still an explanatory gap; it's just that it is narrowed slightly.

Explaining Unconsciousness? (3, Insightful)

sgage (109086) | more than 2 years ago | (#36486868)

This is absurd. For a start, we don't have clue one about how to explain consciousness. Secondly, recording physical correlates to unconsciousness is not an explanation. Like so much of this stuff, it is description masquerading as explanation. Not bad as a start, perhaps, but don't call it "explanation".

Re:Explaining Unconsciousness? (1)

Intrepid imaginaut (1970940) | more than 2 years ago | (#36486928)

It must be the LSD. /walterbishop

Re:Explaining Unconsciousness? (1)

toastar (573882) | more than 2 years ago | (#36487398)

I'd love to see brain scans of someone on LSD

Re:Explaining Unconsciousness? (1)

Fnordulicious (85996) | more than 3 years ago | (#36489868)

They basically look exactly the same. CT, MRI, none of them have any resolution to show you anything interesting in that context.

Re:Explaining Unconsciousness? (0)

Anonymous Coward | more than 3 years ago | (#36490634)

MEG can see that.
The head is transparent to magnetic fields, so the spatial resolution is much better than EEG's.

You're implicitly assuming dualism (3, Informative)

Fred Ferrigno (122319) | more than 2 years ago | (#36487106)

You're upset that the researchers don't also assume that consciousness is some other kind of thing beyond material investigation. The researchers have no need for that assumption unless and until the evidence leads them there.

Re:You're implicitly assuming dualism (0)

Anonymous Coward | more than 3 years ago | (#36489390)

No, he's just reminding everyone how the ever popular "correlation is not causation" applies here.

Re:You're implicitly assuming dualism (1)

major_fault (1384069) | more than 3 years ago | (#36493576)

It sounds more like he is upset because the wording makes it sound as if consciousness is close to being fully explained by medical science, and because of the assumption that consciousness is just a physical thing. Although his being upset might be a misplaced feeling, since there seem to be two different definitions for what it means to research consciousness. Maybe I am wrong, but it seems to me that the two are not the same thing:

1. One is used by medical researchers and means exploring the self-referential reasoning capabilities of existing biological neural networks.
2. The other is used by philosophers and means exploring the reason behind personal experience, answering questions like: "Why am I me? Why do I observe what I observe? Is my consciousness just an illusion created within a deterministic world? What will happen when memory is gone and the system called the brain ceases to perform its duty and is no longer capable of reasoning?" And so on...

The material investigation of consciousness is problematic because, as far as I know, we are incapable of determining the existence of consciousness in anybody other than ourselves. There could easily exist consciousness in non-linear time that is actually the same for all people, dogs and birds. Or there could be any number of consciousnesses that need to correspond to material configurations. Or maybe everything has consciousness, including trees, books and stones.

All that does not mean that the path to understanding the second definition of consciousness might not lie with medical research on the brain. Maybe consciousness is a thing that can be explored with completely material investigation. It would be really interesting to know. So good luck with their research.

Re:You're implicitly assuming dualism (1)

SoupIsGoodFood_42 (521389) | more than 3 years ago | (#36529102)

Why assume a material existence, for that matter?

Re:You're implicitly assuming dualism (0)

Anonymous Coward | more than 3 years ago | (#36551722)

Is that a pun?

Re:Explaining Unconsciousness? (2)

smitty777 (1612557) | more than 2 years ago | (#36487116)

You're right, we don't have a clue how consciousness works. But this article is about the process of becoming unconscious, which is vastly different. It is significant that we can now associate precise physical components with the act of becoming unconscious. Since we are the ones putting the person under, we know that it's at least a partial explanation, which is something we didn't know before.

Re:Explaining Unconsciousness? (0)

ceoyoyo (59147) | more than 2 years ago | (#36487426)

Chill. We can explain consciousness as well as most people can explain how their cars work - it seems to arise from activity in the cortex. We even have a decent idea of the parts of the cortex that are most important. From this article it sounds like there are neurons that act to inhibit that activity and so inhibit consciousness. That does indeed explain unconsciousness. You don't need to know all about transistors and etching integrated circuits to explain why a computer that got dropped from a second story window doesn't work.

No, we don't know all the gory details yet, and we probably won't ever be able to satisfy the hard core philosophical navel gazers.

I beg to differ... (0)

rmdyer (267137) | more than 2 years ago | (#36487608)

I beg to differ greatly with your assumption that "we don't have clue one about how to explain consciousness".

In fact we do know a few things...

You are never "conscious" all the time.
Every millisecond or so people are unconscious.
When you sleep you are also unconscious.
There can be many health related problems that would lead you to being unconscious.

Conscious machine properties:

      Has minimums related to spatial volume and computational capacity for human-level "consciousness".
      Has sensory input.
      Processes sensory input.
      Must have a set time-slice unit for processing new information (you will be unconscious during processing).
      Reduces and stores important processed sensory information (must have memory).
      Compares current sensory input with previous sensory input (negative and positive feedback).
      Can generate associative differential information based on the comparison of current to previous information (and stores that for future feedback).
      Can make decisions based on meta level associative differential information.
      Can interact with its surroundings based on decisions.
      Can process events over time, therefore reasoning that events occur over time.
      Requires a semi-stable, but not quiescent, environment in which to operate.
      (Both environmental deprivation, and complete randomness for input would lead to a non-functioning machine.)
      Probably requires extended downtime to further process daily high level associative differential (and hash) information for higher level reasoning. (Sleep)
      Requires information loss. Must necessarily "forget" non-important (useless) information. It is impossible to store every sensory event.
      Time slice processing can increase or decrease rate based on emergency life protection need.
      (If your life is in danger, adrenaline will increase the rate of processing, and events will seem to "slow down".)
      (Consequently when you age, your sensory processing rate naturally slows, so events in your life seem to "go by faster".)

So the machine must have memory for consciousness, otherwise it is just pure sensory "awareness" without knowledge of itself within and apart from the world.

You are a biochemical machine. All humans are biochemical machines which are "self aware" because of the above properties.

Based on the above properties, it should be possible to build an analog amplifier (op amp) that is trainable to obtain the voltage outputs we desire based on the discrete voltage level inputs. This amplifier would not necessarily be "conscious" but would be the core of something that one day could become "self-aware", and then to "consciousness".
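A few of the listed properties (fixed time slices, memory, comparison of current to previous input, and forced forgetting) can be sketched as a toy agent loop. This is purely illustrative; the threshold and memory limit are invented, and nothing here is claimed to be conscious.

```python
# A minimal agent loop: each call to step() is one time slice. The agent
# stores the difference between current and previous input, discards old
# records when memory fills, and "decides" to react only on large changes.

class TinyAgent:
    MEMORY_LIMIT = 5  # forced information loss ("forgetting")

    def __init__(self):
        self.previous = None
        self.memory = []  # stored differential information

    def step(self, sensory_input):
        """Process one time slice of sensory input."""
        if self.previous is not None:
            diff = sensory_input - self.previous  # compare current to previous
            self.memory.append(diff)
            if len(self.memory) > self.MEMORY_LIMIT:
                self.memory.pop(0)  # drop the oldest record
        self.previous = sensory_input
        # "decision": react only when the latest change exceeds a threshold
        return "react" if self.memory and abs(self.memory[-1]) > 1.0 else "idle"

agent = TinyAgent()
print([agent.step(x) for x in [0.0, 0.1, 5.0, 5.1]])
# -> ['idle', 'idle', 'react', 'idle']
```

Even this trivial loop shows why memory matters in the argument above: without `self.previous` and `self.memory`, the agent could only react to raw input levels, never to change.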

Some philosophical wanderings...

Suppose you awoke the next morning in someone else's body with someone else's memory. Would you then not be "that" person?

The ultimate question is, what makes "you", you?

If you could make a complete copy of yourself using say, a Star Trek like transporter, then what would be the difference between you, and the other "you"? Only one thing... location, which is a position in space-time from which to have and generate separate associative differential information.

It is impossible to have two conscious beings occupy the same space-time location.

Is the "self aware" you that is "you" the same "self aware" me that is "me"? Coo coo ca choo.

%whoami

Re:I beg to differ... (1)

dreemernj (859414) | more than 3 years ago | (#36488248)

Being asleep does not mean being unconscious. If you can be awoken by sound, light, or movement you weren't unconscious.

Re:I beg to differ... (1)

Mia'cova (691309) | more than 3 years ago | (#36490070)

You are unconscious while sleeping. If you can't be woken, that's called a coma. At least, it's a coma after some period of time. I'm not familiar with the exact technical meaning of the word. For example, being anesthetized is obviously not a coma. Let me simplify and put it another way. If you're awake, you're conscious. If you're not awake, you're unconscious. While sleeping, you are not awake, therefore you are unconscious.

Re:I beg to differ... (0)

Anonymous Coward | more than 3 years ago | (#36492772)

Being asleep does not mean being unconscious. If you can be awoken by sound, light, or movement you weren't unconscious.

I could see an argument that dreaming is a form of consciousness, but otherwise yes, sleep is not conscious.

Actually, there's a name for normal consciousness during sleep - it's called "lucid dreaming."

Re:I beg to differ... (0)

Anonymous Coward | more than 3 years ago | (#36489976)

>> Requires information loss. Must necessarily "forget" non-important (useless) information. It is impossible to store every sensory event.

wrong. if you are talking storage capacity, compression is used in the form of correlates/parity.

beg all you want, won't make you right (0)

Anonymous Coward | more than 3 years ago | (#36492702)

first, you need a better grasp of ethology...

1,2, and 3, just no. simple explanation? to build. verb-- an action that ADDS data to 1,2.3 and other points you make, which are really just different words for the same thing...

4, 5, and 6 are tendencies, not a condition of. info is proc'd when awake, as you will see by reading this and learning. wait did I say learning? memory is a function of learning, NOT a prerequisite for consciousness, otherwise babies are not conscious, neither are amnesiacs... learning IS arguably a condition of consciousness... negative and positive is a very oversimplified idea, used in behaviourism--(skinner's work was more focused on ethology, even though many read psychology into his work and became frightened)

7,8, and 9, all defined by learning, meta is not applicable, since there is no limit to that loop, ie all meta data can be objectively analysed and new ideas can be drawn from this. interact with surroundings? see build and no need for sensory input examples (1-3)

10,11 and 12... time is not measured by our brain, nor are pixels seen in our vision, in spite of a ~4k x 4k celled retina. our brains fill in these gaps, just like + 20hz for sound. opposite of stable environment is needed, since it is made to proc changes. sense deprivation/or "white room", leads to "random data" ie hallucinations. not enough information to determine this as functionality or not.

13,14 and 15 are, while interesting, philosophical, not scientific. sleep is unknown at this time. 14 is an assumption on basic subconscious theory. 15 is not always true, as many animals sacrifice life to protect procreation, so self preservation is not an element of consciousness, but a product of it, or instinct.

even if all of those conditions you outlined were true, and you could create them satisfactorily in a machine, bio or otherwise, it would still not be conscious. objectivity and self awareness (very elusive) would not be there; learning, even more so.

not that its a bad thing, but you already wandered philosophically. you are on the right track. a few things to remember. 1 given time learning is infinite. if there is no limit on resources, ie time, and matter, bio-mech needed to learn, then we can't see a limit on learning either..

instant copy of self is a good concept. the difference is existential, ie at that instant it's a spacetime reference, coupled with a different physical makeup: 1, because all matter, even if it is infinite, is unique, and 2, physiological development will be different instantly, as well as mental, ie different data, stream of consciousness etc..

spacetime location is a very fuzzy thing; time implies relative velocity, and given this it is subject to the uncertainty principle. we have not yet defined consciousness as matter or energy, so there is not enough data to say if it can or cannot occupy any space at all. simple example? we don't shrink or lose weight when we die, but our body does stop expressing consciousness to outside observers...

also conjoined twins at the head/brain could lead to some interesting observations about consciousness, ie overlapping, occupying space, shared memory, thoughts/learning etc.. none have survived to offer any learning into the matter

Re:Explaining Unconsciousness? (0, Insightful)

Anonymous Coward | more than 2 years ago | (#36487656)

It's funny how people who are way too full of themselves, always say "we", when they mean "I".

Consciousness is a word used for a simple and stupid artifact of a concept we humans think we need to make us feel special.
There, I said it.

There were times when words like this were used to differentiate humans from other animals, or even from other humans. Or when we thought we were the center of the universe.

In reality, there is nothing that the concept could describe, if only because it's that vague. And we like it that way.

Our brains are just pattern/similarity detectors with the purpose of predicting the future, to gain an advantage over other blobs trying to do the same.

There's nothing special there. Humans are not special. You... are not special.
Just let it go.

Re:Explaining Unconsciousness? (1)

tgv (254536) | more than 3 years ago | (#36489672)

You're right: how the hell do they define unconsciousness without defining consciousness? And then they define it as the process that happens when you get anaesthetized? That's all about adding drugs to suppress brain functions. No wonder they find something like this.

Apart from the methodological errors that plague such studies. 3D imaging from EEG is at least 10 years old now, and relies on assumptions about the electrical structure of the places where the signal originates, and can only see part of the brain. Then the question becomes: how do you interpret all these different 3D high speed measurements? Apparently, they just found a qualitative difference while looking at the images. That's limited.

Anyway, that doesn't mean this technique/technology doesn't have value (e.g. in anaesthesia), just that the press blurb claims way too much...

Re:Explaining Unconsciousness? (0)

Anonymous Coward | more than 3 years ago | (#36490836)

We've got an idea of what the different parts of the brain do (from working with stroke, cancer and brain injury victims who have had parts of the brain disabled or removed). We know from MRI scanning (diffusion tensor analysis) the alignment of the nerve pathways. The closest computer model equivalent would be data-flow processing. You've got input passing from sensors (eyes, ears, tongue, skin, nose, joints), passing through long chains of processing filters that each do some particular task (for vision, you have edge enhancement, shape recognition, stereoscopic depth conversion, texture matching). Some of this information goes straight into the hypothalamus to drive emotions and "fight or flight" response.

Then you have a motor neuron control system to allow a person to move about and maintain their sense of balance. This actually forms a feedback system that is closely modeled by robotics. While every part of the brain has neurons with input synapses and an output synapse, there was one region of the brain that didn't have any output synapses, but instead just interconnected with adjacent neurons; these were described as "spindle neurons". When this area of the brain was damaged, the patient never regained consciousness.

To have someone go unconscious temporarily, you have to slow down the brain and disable the motor control system (to stop them acting out their dreams physically). When these go wrong, you get people having narcolepsy attacks, going into comas, or sleepwalking.

Sounds like a lie detector (0)

Anonymous Coward | more than 2 years ago | (#36486914)

Portable. Can tell what the brain is doing (i.e. either remembering or creating). Nah, no one would ever use it for that...

Yes, the Cat Has My Tongue (1)

AmberBlackCat (829689) | more than 2 years ago | (#36487070)

I'm having trouble seeing what's so exciting about explaining unconsciousness. Explaining consciousness would be exciting. I realize understanding what makes a person unconscious might help to understand what makes a person conscious. But not in this case. If they're just saying the presence of these inhibitors makes a person unconscious, then we're no closer to understanding consciousness. Because you can't just make an unconscious object become conscious by taking away these inhibitors. And you have no insight into bringing consciousness to something that never had it, regardless of whether these inhibitors were present. Maybe it could somehow help find a treatment for comas or something. Maybe those people are overloaded with these inhibitors. I don't know.

Re:Yes, the Cat Has My Tongue (3, Insightful)

SydShamino (547793) | more than 2 years ago | (#36487110)

Well if neural inhibitors (which interfere with the processing of certain parts of our neural network) cause us to lose consciousness, then one could hypothesize that those parts of our neural network must play a role in consciousness. And that makes us at least a little closer to understanding it.

Re:Yes, the Cat Has My Tongue (0)

AmberBlackCat (829689) | more than 2 years ago | (#36487120)

I mean, a blow to the head will also cause us to lose consciousness. But it won't help us understand what makes us conscious.

Re:Yes, the Cat Has My Tongue (1)

phantomfive (622387) | more than 2 years ago | (#36487158)

Yes, but do you know why a blow to the head makes you unconscious? Apparently it's these neural inhibitors. The better you understand the brain, the closer you are to understanding consciousness. This is one step closer.

Re:Yes, the Cat Has My Tongue (3, Insightful)

TrekkieGod (627867) | more than 2 years ago | (#36487344)

I mean, a blow to the head will also cause us to lose consciousness. But it won't help us understand what makes us conscious.

Actually, it does. It tells us that the organ responsible for consciousness resides in the head. Similarly, we've discovered a lot about what different regions of the brain are responsible for by looking at people who received brain damage to different areas and looking at what they were now unable to do as a result. You know the brain is responsible for consciousness, this can help narrow down what brain activity is involved by looking at what activity is inhibited when you're unconscious.

Re:Yes, the Cat Has My Tongue (1)

Mia'cova (691309) | more than 3 years ago | (#36490078)

Being on the receiving end of a solid kick to the junk can knock you out too :)

Re:Yes, the Cat Has My Tongue (1)

DMUTPeregrine (612791) | more than 3 years ago | (#36507294)

Thus, in males, the organ responsible for consciousness resides in the heads.

Re:Yes, the Cat Has My Tongue (1)

smitty777 (1612557) | more than 2 years ago | (#36487156)

I guess it really depends on how you define "consciousness". If by consciousness you mean the ability to think, perceive, and reason, then it's relatively easy. However, it sounds as if you're trying to equate consciousness with life (e.g., how do you make a toaster conscious). Well, if you add a microprocessor to a toaster, does it become conscious?

Re:Yes, the Cat Has My Tongue (1)

Christoph (17845) | more than 3 years ago | (#36489084)

The distinction is between consciousness (or awareness) versus "conscious awareness", which is the awareness that one IS conscious.

My garage door opener has an electric eye that makes it "aware" whether anything is blocking the path of the closing door. It is not aware that it is aware of this. I, on the other hand, am both aware if the path is blocked, and I am AWARE that I am aware of it.

A toaster with a microprocessor could be called "aware" of specific info, but it's not aware that it's aware of it.

Conscious awareness is akin to being "sentient", in that it's immoral/illegal to brutalize a sentient being. You can brutalize your toaster, but not a person or animal. We take it on faith that others have conscious awareness...for all I know, I'm the only person who is AWARE of his consciousness, and everyone else is a biological robot that's not actually sentient.

I don't think biology can explain conscious awareness. We can't even prove it exists, despite everyone having the direct, personal experience of it.

Re:Yes, the Cat Has My Tongue (1)

smitty777 (1612557) | more than 3 years ago | (#36490404)

The distinction is between consciousness (or awareness) versus "conscious awareness", which is the awareness that one IS conscious.

Actually, what you're describing is called metacognition [wikipedia.org] by us cognitive scientists. I was trying to make the point that the GP seemed to be confusing the two.

Re:Yes, the Cat Has My Tongue (1)

140Mandak262Jamuna (970587) | more than 3 years ago | (#36491236)

You can brutalize your toaster, but not a ...

I am a toaster, you insensitive clod! And when I'm done toasting, I also help at the drive-in counter.

Re:Yes, the Cat Has My Tongue (2)

ceoyoyo (59147) | more than 2 years ago | (#36487440)

Well, to start with, the actual mechanism by which general anaesthetic causes unconsciousness is unknown. It's interesting to know how something we use so much actually works. And it might be interesting to know whether it works by the same mechanism that falling asleep naturally does.

Re:Yes, the Cat Has My Tongue (2)

sjames (1099) | more than 3 years ago | (#36490744)

For starters, it might allow us to develop safer anesthetics and a more effective measure of the depth of anesthesia during surgery so that patients can be given just enough without risk of awareness during the procedure.

The current devices for that are known to have a significant margin for error.

Meh. (0)

PPH (736903) | more than 2 years ago | (#36487102)

A Breathalyzer explains it just as well.

Universal Consciousness? (0)

Anonymous Coward | more than 2 years ago | (#36487518)

Interesting that consciousness is suppressed by inhibitory neurons. So consciousness is suppressed by the brain, not expressed?

What a silly spelling (1)

Shin-LaC (1333529) | more than 2 years ago | (#36487688)

It looks like they tried to type "Feiter" as if they were normal people naming a thing, but they forgot their Caps Lock was on.

Re:What a silly spelling (1)

Tim C (15259) | more than 3 years ago | (#36500744)

It's named along the same lines as fMRI [wikipedia.org].

The movie (1)

Tibe (444675) | more than 2 years ago | (#36487698)

3-D movie shows what happens in the brain as it loses consciousness...

http://www.youtube.com/watch?v=MzX7w2-FWAA [youtube.com] (24 seconds)

Re:The movie (0)

Anonymous Coward | more than 3 years ago | (#36490680)

Thank you sir!

Critical Section? (0)

Anonymous Coward | more than 3 years ago | (#36490028)

So consciousness is like a critical section in multi-threaded code? :)
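Tongue-in-cheek, but the analogy is easy to sketch. A minimal Python illustration (the "inhibitory" lock framing and all names are purely hypothetical, just riffing on the comment):

```python
import threading

# Toy analogy: an "inhibitory" lock gates the shared conscious
# workspace; while one thread holds it, all others are blocked.
workspace_lock = threading.Lock()
conscious_events = []

def perceive(stimulus):
    # Critical section: only one thread at a time may enter.
    with workspace_lock:
        conscious_events.append(stimulus)

threads = [threading.Thread(target=perceive, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(conscious_events))  # all 8 stimuli recorded, none lost
```

Holding the lock forever would be the multi-threaded equivalent of the inhibitory assemblies in TFA: nothing else gets through.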

alternate applications (0)

Anonymous Coward | more than 3 years ago | (#36491388)

I'm curious what the results of using this device to monitor REM sleep or meditation would produce.

Re:alternate applications (0)

Anonymous Coward | more than 3 years ago | (#36491452)

They are both periods of mild awareness during unconsciousness, depending on how deep you get into it.

fEITER vs. EEG/MEG tomography (0)

Anonymous Coward | more than 3 years ago | (#36498748)

How does fEITER compare with EEG/MEG tomography in spatial and temporal resolution? It seems to me (disclaimer: IINANS) these latter methods can more or less measure the same phenomena that fEITER can. Moreover, EEG/MEG tomography are truly passive/non-invasive, while fEITER involves passing a high-frequency current through the brain, which may affect the very phenomenon that's being measured.

What am I missing? Can a neuroimaging expert here please explain the relative strengths and weaknesses of EEG/MEG tomography vs. fEITER?

Thanks
Joyous Blur Jockey
