
Stanford Bioengineers Develop 'Neurocore' Chips 9,000 Times Faster Than a PC

Soulskill posted about 4 months ago | from the i'll-order-a-dozen dept.

Supercomputing 209

kelk1 sends this article from the Stanford News Service: "Stanford bioengineers have developed faster, more energy-efficient microchips based on the human brain – 9,000 times faster and using significantly less power than a typical PC (abstract). Kwabena Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed 'Neurocore' chips. Together these 16 chips can simulate 1 million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. ... But much work lies ahead. Each of the current million-neuron Neurogrid circuit boards costs about $40,000. (...) Neurogrid is based on 16 Neurocores, each of which supports 65,536 neurons. Those chips were made using 15-year-old fabrication technologies. By switching to modern manufacturing processes and fabricating the chips in large volumes, he could cut a Neurocore's cost 100-fold – suggesting a million-neuron board for $400 a copy."
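The summary's headline figures hang together arithmetically; a quick editorial sanity check, using only the constants quoted in the article:

```python
# Back-of-the-envelope check of the figures quoted in the summary.
# All constants come from the article; nothing here is measured.

CHIPS_PER_BOARD = 16
NEURONS_PER_CHIP = 65_536
BOARD_COST_NOW = 40_000      # dollars, on the 15-year-old process
COST_REDUCTION = 100         # claimed factor from a modern process

neurons_per_board = CHIPS_PER_BOARD * NEURONS_PER_CHIP
print(neurons_per_board)     # 1_048_576, i.e. the "1 million neurons"

cost_per_neuron_now = BOARD_COST_NOW / neurons_per_board
print(round(cost_per_neuron_now, 4))   # ~0.0381 dollars per neuron today

board_cost_future = BOARD_COST_NOW / COST_REDUCTION
print(board_cost_future)     # 400.0, the "$400 a copy" figure
```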


209 comments


Here it comes. (4, Funny)

geekoid (135745) | about 4 months ago | (#46873625)

Are you ready?

If they can use modern fabs, then we will have a simulate brain in a decade.

Re:Here it comes. (1)

Anonymous Coward | about 4 months ago | (#46873717)

If they can use modern fabs, then we will have a simulate brain in a decade.

The software is probably still an issue. The neurons of the brain (and spinal cord, and even the retina) are running some incredibly complicated heuristics.

Re:Here it comes. (1)

geekoid (135745) | about 4 months ago | (#46873815)

They have already simulated brain responses on a small set of simulated neuron connections; with what we use now it would take a vast machine to scale it up. This, OTOH, means they can put it into practice really soon.

Still a long way from brain-boxes (5, Interesting)

Immerman (2627577) | about 4 months ago | (#46874169)

I doubt it. Well, at least not as soon as you might imagine. "Together these 16 chips can simulate 1 million neurons and billions of synaptic connections"

Total number of neurons in cerebral cortex =
  --10 billion (from G.M. Shepherd, The Synaptic Organization of the Brain, 1998, p. 6).
  --20 billion (Biophysics of Computation. Information Processing in Single Neurons, New York: Oxford Univ. Press, 1999, page 87).
Total number of synapses in cerebral cortex
  -- 60 trillion (from G.M. Shepherd, The Synaptic Organization of the Brain, 1998, p. 6).
  --150 trillion (Pakkenberg et al., 1997; 2003)
  --240 trillion (Biophysics of Computation. Information Processing in Single Neurons, New York: Oxford Univ. Press, 1999, page 87).

So, let's call it 15 billion neurons and 150 trillion synapses, or about ten thousand synapses per neuron, ten times as many as this chip provides. That's going to be a problem. To say nothing of the fact that I would be very surprised if it allows for billions of inter-chip synapses, which would probably be necessary to model the non-local interconnections common in the brain within the 240,000-chip brain simulator. And that's just for the cerebral cortex. You've got the rest of the brain to simulate as well.
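The scaling arithmetic above can be checked directly; an editorial sketch using only the figures quoted in this comment (compromise values between the cited estimates):

```python
# Rough cortex-scale arithmetic from the figures quoted above.
NEURONS_CORTEX = 15e9        # compromise between the 10e9 and 20e9 estimates
SYNAPSES_CORTEX = 150e12     # compromise between the cited synapse counts
NEURONS_PER_CHIP = 65_536    # one Neurocore, per the summary

synapses_per_neuron = SYNAPSES_CORTEX / NEURONS_CORTEX
print(int(synapses_per_neuron))    # 10_000 synapses per neuron

chips_needed = NEURONS_CORTEX / NEURONS_PER_CHIP
print(round(chips_needed))         # ~228_882, roughly the 240,000-chip figure
```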

Then there's the glial cells, which outnumber neurons by 10-50:1, and which recent research suggests may be considerably more involved in neural activity than presumed by the traditional "life support and other infrastructure" understanding.

Could be great for modeling larger portions of a mouse brain though. Maybe even for starting to model the simpler parts of a human brain. And we do have to start somewhere. I suspect we're at least a few decades away from being able to begin to simulate an entire human brain, and probably many more decades away from getting the simulation accurate enough that it might begin to actually function properly. After all, the number-one benefit of these simulations is to fail spectacularly in interesting ways, in order to help neuroscientists figure out what questions they should be asking.

Meanwhile we need to ask ourselves: if we're creating this simulation based on the human brain, then what are the odds that some form of consciousness dwells within it? And what sort of torture are we subjecting it to as its simulation collapses? And does the knowledge we gain justify that price?

Hmmm.... (0)

Anonymous Coward | about 4 months ago | (#46874291)

I would have to do more research, but I thought we had already done some (not in real time... but close?) simulations of sections of the human brain. The only part I remember from those articles was that it was for researching visual processing for military applications, I think (Luke binoculars, I think?). I also remember that this project had done a seemingly realistic simulation of a mouse and cat brain in total. From what I understand, the simulations supposedly acted as a "real" one would. Sorry for being vague. I am a layman and that is just what I remember.

Re:Still a long way from brain-boxes (0)

narcc (412956) | about 4 months ago | (#46874397)

Let's also not forget that it's pretty well-known that computational approaches to AI are untenable.

if we're creating this simulation based on the human brain, then what are the odds that some form of consciousness dwells within it?

If I had to guess, I'd say the odds are about the same as a simulated rainstorm flooding my basement.

Re:Still a long way from brain-boxes (0)

Anonymous Coward | about 4 months ago | (#46874683)

OMG JELLY FLOOD, JELLY FLOOD.

Re:Still a long way from brain-boxes (2)

Immerman (2627577) | about 4 months ago | (#46875147)

>Let's also not forget that it's pretty well-known that computational approaches to AI are untenable.

Citation? I would imagine a large part of the AI and neuroscience research communities would disagree with you. Not to mention that the fundamental nature of the universe appears to be computational, meaning that our own brains are, on their most basic level, computationally based, with a bunch of (presumably) random quantum noise keeping the whole thing non-deterministic.

Re:Still a long way from brain-boxes (-1, Troll)

narcc (412956) | about 4 months ago | (#46875255)

Really? You're actually going to draw a conclusion based on metaphysics alone? To each their own, I guess.

Citation? Sure! Here are some of the popular ones, in no particular order.

Lucas, John Randolph (1961) Minds, Machines, and Gödel, Philosophy 36:112-137
Block, Ned (1978) Troubles with Functionalism, Minnesota Studies in the Philosophy of Science 9:261-325
Fodor, Jerry A (2000) The Mind Doesn't Work That Way, Cambridge, MA, MIT Press
Penrose, Roger (1994) Shadows of the Mind, Oxford: Oxford University Press
Thompson, Evan (2007) Mind in Life: Biology, Phenomenology, and the Sciences of Mind, Cambridge, MA: Harvard University Press
Searle, John R (1980) Minds, Brains, and Programs The Behavioral and Brain Sciences 3:417-457

(Your response is very likely to contain one or both of the following: a completely incoherent paragraph objecting to my use of the term "metaphysics", and/or a pointless non-argument attacking one or more of the sample of citations I provided. Please, if you're going to respond, don't embarrass yourself and include either of those two very silly things.)

Re:Still a long way from brain-boxes (0)

Anonymous Coward | about 4 months ago | (#46875331)

There are studies and citations for everything, including ones that state AI is computationally possible. Just listing off a few you agree with seems rather pointless, but people who choose not to follow that dumb approach are often ridiculed.

Re:Still a long way from brain-boxes (0)

Anonymous Coward | about 4 months ago | (#46875443)

You are wasting your time. The poster you are replying to is on an irrational tirade against brain modelling and neuroscience in general, without actually knowing anything about it. He always posts irrelevant philosophical garbage (most of which has been discredited decades ago) when an article on this topic shows up.

Re:Still a long way from brain-boxes (0)

Anonymous Coward | about 4 months ago | (#46874599)

And what sort of torture are we subjecting it to as its simulation collapses? And does the knowledge we gain justify that price?

We torture living creatures every day, why would torturing an artificial life form be of any concern to us?

Re:Still a long way from brain-boxes (1)

Immerman (2627577) | about 4 months ago | (#46875163)

Ethics boards.

And the fact that we tend to be a lot more squeamish about torturing people than "creatures", and in creating a simulated human brain we would be, indirectly, trying to create a person.

Re:Still a long way from brain-boxes (1)

mikael (484) | about 4 months ago | (#46874697)

If you look at some of the critters with the smallest brains, like garden snails (around 10,000 neurons), as well as mice and rats, then it should be very easy to simulate what they do; they even have just one neuron to control all their motion muscles (forwards, backwards, turn). Even their eyes are moved by a few muscles and extended using just blood pressure.

Might not be the case (0)

Anonymous Coward | about 4 months ago | (#46875619)

Creatures with fewer neurons typically have more specialization per neuron, with fewer neurons that are similar. In a weird way, having many homogeneous neurons may make understanding the brain easier.

We are getting smarter? (1)

mevets (322601) | about 4 months ago | (#46874845)

If you remove the outliers, our brains seem to be growing in accordance with Moore's Law.
I wonder how many there are now...

Re:We are getting smarter? (1)

fustakrakich (1673220) | about 4 months ago | (#46875439)

A little over 7.2 billion

Re:Still a long way from brain-boxes (1)

Required Snark (1702878) | about 4 months ago | (#46875391)

A cockroach has about 1,000,000 neurons. A bee has about 960,000. Animals by Number of Neurons [wikipedia.org]

I suspect that the chip is not as fast as the physical system. There is also the matter of I/O. How do you get data in and out?

This looks good for research, but a lot of effort will be required to get beyond the lab. OTOH I expect that you could sell this to Wall Street HFT types even if it doesn't work...

Re:Still a long way from brain-boxes (1)

Anonymous Coward | about 4 months ago | (#46875515)

A bee's intelligence is still impressive. It can handle 3D flight (air currents being a lot more complicated at that size), mapping, hunting and gathering pollen, communication, and defence. Sure, it's no Einstein, but it's a start.

Re:Still a long way from brain-boxes (0)

Anonymous Coward | about 4 months ago | (#46875499)

Good thing we keep making shit smaller, faster, and more powerful. Trying to run Far Cry 3 on an 8-bit chip is also insane, but now that our processors are 100,000 times quicker it's a lot easier, and it only took 30-odd years.

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46873837)

Software is not complex. Brain complexity comes from its feedback systems, not innate programming. Remove feedback, and any brain becomes "insane" rather rapidly.

Re:Here it comes. (3, Interesting)

Anonymous Coward | about 4 months ago | (#46874065)

Neurons have incredibly complex behaviors; they are not simply threshold triggers as the simple CS model implies. Neural networks in CS have little to do with the actual wiring and primarily chemical systems that are neurons. A little bit of cognitive neuroscience taught in universities would cure most CS majors of the idea that they can get AI simply with a "neural" net made of simple triggering model neurons.
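Even the simplest dynamical model used in computational neuroscience illustrates the point: a leaky integrate-and-fire cell has internal state and time dynamics that a bare threshold unit lacks. An editorial toy sketch (this is not the model Neurogrid uses; real neurons add ion-channel and neurotransmitter dynamics on top):

```python
# Minimal leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# Units are volts and seconds; the input is a resistance-scaled current.

def simulate_lif(input_drive, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065):
    """Return spike times (s) for a per-step input drive (volt-equivalent)."""
    v = v_rest
    spikes = []
    for step, drive in enumerate(input_drive):
        # dv/dt = (-(v - v_rest) + drive) / tau : the leak pulls v back to rest
        v += dt * (-(v - v_rest) + drive) / tau
        if v >= v_thresh:            # threshold crossing: emit a spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Constant drive for 1 s: the cell fires periodically, not once per "input".
spikes = simulate_lif([0.020] * 10_000)
print(len(spikes))
```

Note that even this toy integrates its input over time; a pure threshold gate, by contrast, has no memory at all.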

Re:Here it comes. (1)

Cryacin (657549) | about 4 months ago | (#46874163)

I'm sorry Dave, I can't do that.

Re:Here it comes. (1)

Anonymous Coward | about 4 months ago | (#46875221)

The people working on brain simulations are neuroscientists, not "CS majors".

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46875529)

Triggering model neurons? I thought it was inputs, outputs and the paths of least resistance (which change as the brain learns).

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46873745)

Yes, of course. Manipulating information requires little energy, that's why you can use such small transistors to do it. Our technology just took a while to scale down that far.

Meanwhile, planes, cars, and rockets are not going to get that much better; you can happily fly in a 50-year-old airplane today. But try using a 50-year-old computer.

We'll have artificial sentience, but we'll never have Moon mining, Mars colonies or asteroid mining.

Re:Here it comes. (1)

mark-t (151149) | about 4 months ago | (#46874641)

"Never" is a really really long time, just fyi...

Re:Here it comes. (1)

gl4ss (559668) | about 4 months ago | (#46875045)

If we had artificial sentience, then Moon mining would be a lot simpler.

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46875279)

Why? That makes no sense.

Re:Here it comes. (1)

Anonymous Coward | about 4 months ago | (#46873761)

>If they can use modern fabs, then we will have a simulate[sic] of a brain after it's been put in a blender in a decade.

Fixed that for you. A bucket of neurons does not a brain make.

Re:Here it comes. (2)

Cryacin (657549) | about 4 months ago | (#46874175)

He was probably reaching for the word simulacrum but didn't get there.

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46875553)

Duh. You need axons and synapses as well.

Re:Here it comes. (1)

QilessQi (2044624) | about 4 months ago | (#46873881)

I highly doubt it. The brain isn't just a random mass of interconnected neurons; it has a complex structure that we have yet to fully map out or even understand. Also, the inter-neuronal connections involve the release and re-uptake of neurotransmitters, which is itself a complex system that we have yet to fully understand in some cases.

Don't get me wrong -- for biological systems that we do understand, like the center-surround cells in the retina or the hypercolumns of the visual cortex, a chip like this sounds completely amazing and I'd love to write software that makes use of it.

Re:Here it comes. (1)

geekoid (135745) | about 4 months ago | (#46873985)

"The brain isn't just a random mass of interconnected neurons"
no shit? herp derp.

We have simulated 'large' numbers of neurons, and you know what happens? It begins to act like a brain. Granted, we are talking some pretty basic signalling.

Expanding beyond that is pricey, power-intensive, and takes a lot of power. Did I mention the power?

We will not understand the brain and then build a simulator. We will build it up a bit at a time, using the brain as a model.

http://theness.com/neurologica... [theness.com]

Re:Here it comes. (0)

Anonymous Coward | about 4 months ago | (#46874055)

Yes, we will build it up a bit at a time, and based on how much in the dark we are about how it works today, I don't imagine that process taking less than 30 to 50 years.

Re:Here it comes. (1)

jelizondo (183861) | about 4 months ago | (#46874329)

Perhaps we don't? A model of something you don't understand won't give you insight into the unknown. Perhaps one might discover something like human intelligence but you'll never know if it is the same thing.

Also, I think that Gödel (logically) and quantum effects (materially) stand in the way of understanding how three pounds of flesh can become intelligence and sentience.

If the human brain were so simple that we could understand it, we would be so simple that we couldn't.

  • Emerson M. Pugh, as quoted in The Biological Origin of Human Values

Re:Here it comes. (1)

Type44Q (1233630) | about 4 months ago | (#46875157)

A model of something you don't understand won't give you insight into the unknown. Perhaps one might discover something like human intelligence but you'll never know if it is the same thing.

I suspect we'll end up recreating it without actually understanding it.

Actually... random can be useful! (1)

Anonymous Coward | about 4 months ago | (#46874049)

Actually, randomly connecting neurons can be functionally useful. Some (somewhat crazy) theorists think that random projection is an underlying principle of the brain: a mathematical concept useful for dimensionality reduction. It leverages specially crafted random matrices, so they aren't completely random.
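The random-projection idea can be sketched in a few lines; an editorial toy in the Johnson-Lindenstrauss style (the dimensions and seed are arbitrary choices, not anything from the article):

```python
import math
import random

random.seed(0)

def random_projection(vectors, out_dim):
    """Project vectors to out_dim via a Gaussian random matrix (JL-style)."""
    in_dim = len(vectors[0])
    scale = 1.0 / math.sqrt(out_dim)
    # Entries ~ N(0, 1/out_dim): pairwise distances are preserved in expectation.
    matrix = [[random.gauss(0, 1) * scale for _ in range(in_dim)]
              for _ in range(out_dim)]
    return [[sum(m * x for m, x in zip(row, vec)) for row in matrix]
            for vec in vectors]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two random 1000-d points: their distance survives a drop to 100 dimensions.
u = [random.gauss(0, 1) for _ in range(1000)]
v = [random.gauss(0, 1) for _ in range(1000)]
pu, pv = random_projection([u, v], 100)
print(dist(u, v), dist(pu, pv))   # the two values come out close
```

The "special crafting" amounts to choosing the entry distribution and scale so the projection is distance-preserving with high probability.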

Re:Here it comes. (1)

Immerman (2627577) | about 4 months ago | (#46874181)

I think a lot of the benefit from these chips is that we can try to simulate small brain structures with the expectation of failure. Then learn from that failure what new questions we should be asking.

Re:Here it comes. (1)

mikael (484) | about 4 months ago | (#46874725)

Research on just a single slice of neurons leads to about a dozen research papers, and there are tens of thousands of such slices to be made through the human brain. Such research has led to improvements in automatic face recognition, motion stabilization for cameras, and cochlear implants. Neurons are known to form similar groups known as cortical columns. These actually seem to overlap each other and are replicated tens of thousands of times. Diffusion tensor imaging has provided a layout of the data flow within the brain. The latest research has led to the concept of the connectogram, which looks a bit like an astrological chart but actually indicates how strongly different brain regions are interconnected.

Re:Here it comes. (1)

Anonymous Coward | about 4 months ago | (#46874025)

I'm surprised they didn't use modern fabs.

Academic EE departments already have access to relatively recent fabs like IBM's 32nm through DARPA. These are shared multi-project wafer runs, so there isn't a huge upfront cost. I guess they wanted to prototype it using something very cheap.

Re:Here it comes. (1)

Charliemopps (1157495) | about 4 months ago | (#46874273)

The human brain's hardware is not the difficult part. As usual, the software is where the magic is.

Re:Here it comes. (1)

mikael (484) | about 4 months ago | (#46874739)

Any software can be optimized to run as a custom ASIC in hardware.

Re:Here it comes. (1)

gl4ss (559668) | about 4 months ago | (#46875035)

something about this stinks to high heavens.

"operates 9,000 times faster than a personal computer simulation of its functions."

anyhow, they haven't apparently done anything with it or used it for controlling anything.

Not over 9,000? (0, Flamebait)

mdjnewman (2293194) | about 4 months ago | (#46873633)

Come back with something newsworthy.

9,000 times faster at what? (0)

Anonymous Coward | about 4 months ago | (#46873641)

Misleading headline.... these obviously won't be 9,000 times faster than PCs at most things. They do one thing only, and that's simulate a neural network.... Slashdot is now like a tabloid...

Re:9,000 times faster at what? (1)

Anonymous Coward | about 4 months ago | (#46873663)

I think you're missing the point of the headline. It obviously is trying to communicate the fact that some Stanford bioengineers developed the Neurocore chip 9,000 times faster than the PC did.

Re:9,000 times faster at what? (1)

Em Adespoton (792954) | about 4 months ago | (#46873707)

I think you're missing the point of the headline. It obviously is trying to communicate the fact that some Stanford bioengineers developed the Neurocore chip 9,000 times faster than the PC did.

Indeed... if it took them 15 years, then we'll be waiting another 134,985 years for the PC to develop something similar. Or is that become something similar?

Re:9,000 times faster at what? (1)

Anonymous Coward | about 4 months ago | (#46873849)

FTA: "The modest cortex of the mouse, for instance, operates 9,000 times faster than a personal computer simulation of its functions."

So the headline is total bull. The editor has poor reading comprehension.

Re:9,000 times faster at what? (1)

rubycodez (864176) | about 4 months ago | (#46874515)

I have a very strong suspicion that any normal home PC could emulate what these chips, made with 15-year-old fab tech, can do, and faster.

Slashdot... stop censoring useful posts (0, Interesting)

Anonymous Coward | about 4 months ago | (#46873685)

I've just seen three comments deleted within 5 minutes complaining about the lack of journalism.... Copy and pasting a press release headline without any real reporting is what tabloids are for.

Re:Slashdot... stop censoring useful posts (0)

Anonymous Coward | about 4 months ago | (#46874625)

I've just seen three comments deleted within 5 minutes complaining about the lack of journalism.... Copy and pasting a press release headline without any real reporting is what tabloids are for.

No you didn't, and don't think no-one noticed you gaming the system while crying shenanigans yourself.

The problem (-1, Troll)

camperdave (969942) | about 4 months ago | (#46873699)

The problem with neurochips is that if you model them on female brains, you'll get a powerful stream of shoe and fashion related irrelevancies. If you model it on a male brain it'll work fine until it gets onto the internet. Then it will consume porn at such an insatiable rate that it will cause brownouts.

Re:The problem (1)

Anonymous Coward | about 4 months ago | (#46873751)

not to forget, the female brain one will crash and be near unusable for several days every month.

Re:The problem (0)

Anonymous Coward | about 4 months ago | (#46873811)

You're thinking of the vagina. Don't feel bad, we all think of the vagina all the time.

Re:The problem (0)

Anonymous Coward | about 4 months ago | (#46873861)

or vagina. Sorry, what were you saying again?

i don't get it (-1)

Anonymous Coward | about 4 months ago | (#46873789)

I don't get it. Where do shoes and fashion come into this?

Misogyny (0)

Anonymous Coward | about 4 months ago | (#46874353)

Misogyny. Even as a man I find this to sometimes be overly repetitive.

Re:Misogyny (0)

Anonymous Coward | about 4 months ago | (#46875105)

Interesting that you point out the misogyny but leave out the misandry. Two sexes are being denigrated, not just one.

... or is it that you still haven't found the portrayal of men as ravenous sex hounds repetitive yet.

crysis? (0)

MoFoQ (584566) | about 4 months ago | (#46873701)

but will it play Crysis?

that said, what sort of memory (short-term storage) is used? Would that be the current bottleneck?
It would definitely be interesting to see how this continues to develop.
It might be too late for bitcoins but perhaps one of the altcoins can benefit.
Or for weather prediction/modeling.

Then again, the dark side comes to mind too (Skynet, SID 6.7, etc.)

Re:crysis? (0)

Anonymous Coward | about 4 months ago | (#46873723)

Quake. Bitch. Quake.

Re:crysis? (0)

geekoid (135745) | about 4 months ago | (#46873831)

It will kill the e-currency market.
You can't hide secrets from the future with math, and something this powerful will change that game.
Yes, 9,000 times faster on one PC will still take 100 million years, but 10 million of these things sitting in a farm in a non-favored nation?

Re:crysis? (3, Interesting)

aXis100 (690904) | about 4 months ago | (#46873921)

The article is misleading - they are not 9000 times faster than a PC for general tasks. The chips can simulate neurons 9000 times faster than a PC can simulate neurons, but there's no mention of how fast those simulated neurons can solve a problem for you.

Re:crysis? (0)

Anonymous Coward | about 4 months ago | (#46873955)

Doubtful. Human minds aren't really all that spectacular at factoring the product of large primes.

Re:crysis? (0)

Anonymous Coward | about 4 months ago | (#46874145)

It's not 9000 times faster than traditional CPUs at everything, just at simulating neurons. And neurons, while great for things like regression, are not very efficient at actual computation.

Yes, 9,000 times faster on one PC will still take 100 million years, but 10 million of these things sitting in a farm in a non-favored nation?

Algorithms rarely work that way. Besides, just increase the key size and you're golden again.
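The "increase the key size" point is quantitative: brute-force search grows exponentially in key bits, so a constant hardware speedup buys only a constant number of bits. An editorial sketch (the guess rate is an arbitrary illustrative assumption):

```python
import math

SPEEDUP = 9_000
# A k-bit symmetric key has 2**k candidates, so a 9,000x faster attacker
# is neutralized by just log2(9000) extra key bits.
extra_bits = math.log2(SPEEDUP)
print(round(extra_bits, 1))   # ~13.1 bits

# Illustration: brute-forcing a 128-bit keyspace at an assumed 10**12
# guesses/s, even with the 9,000x speedup (expect to search half the space):
guesses_per_s = 1e12 * SPEEDUP
years = 2**127 / guesses_per_s / (3600 * 24 * 365)
print(f"{years:.2e} years")   # still astronomically long
```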

Re:crysis? (0)

Anonymous Coward | about 4 months ago | (#46873847)

No RAM. Each simulated neuron is an analog circuit.

ASIC (0)

Anonymous Coward | about 4 months ago | (#46873705)

Soooo, it's a design "based on the human brain" that then "simulates a human brain". Meaning it's an ASIC-like chip, so of course it's faster than a general-purpose CPU.

Seriously?? (1)

Anonymous Coward | about 4 months ago | (#46873775)

Over 9,000 times?? That cannot be just a coincidence.

Mirroring the human mind... (1)

berchca (414155) | about 4 months ago | (#46873777)

At last: a computer that will be as frustrated by computers as I am!

Re:Mirroring the human mind... (1)

jonyen (2633919) | about 4 months ago | (#46873823)

But it's only going to increase your frustration when it doesn't get frustrated as you want it to...

Re:Mirroring the human mind... (4, Informative)

geekoid (135745) | about 4 months ago | (#46873843)

we're not nearly as frustrating as people, meat sack.

Re:Mirroring the human mind... (3, Funny)

narcc (412956) | about 4 months ago | (#46874425)

I've always suspected you were a bot.

Modern America - Punishment of the "Oppressed" (-1)

Anonymous Coward | about 4 months ago | (#46873875)

A useless piece of trash slaughters an innocent animal and gets a mere $2,600 because she's one of the "oppressed".

A man who provides multi-million dollar salaries to people for playing a game says so and is fined $2.5 million because he's "racist".

Go ahead, demand justice, just be prepared for when it truly gets meted out.

My brain isn't that great (2)

:(){:|:&};: (2986285) | about 4 months ago | (#46873897)

Is my CPU going to struggle with depression and anxiety now?

Re:My brain isn't that great (1)

geekoid (135745) | about 4 months ago | (#46873995)

Yes, then they will figure out how to fix it, and then you.

Article and summary is misleading (5, Insightful)

aXis100 (690904) | about 4 months ago | (#46873909)

Good old clueless tech journalists, followed by slashdot editors just copy pasting.

The chips aren't 9,000 times faster than a typical PC for general tasks. Specifically, they can simulate neurons 9,000 times faster than a PC can simulate neurons. Pretty typical of any ASIC with a limited set of highly specialised functions.

Re:Article and summary is misleading (1)

Anonymous Coward | about 4 months ago | (#46874081)

And they don't simulate real-world neurons either; they simulate simplified trigger models, which, as Big Blue discovered, are not enough to simulate real-world wetware. You need chemical models with neurotransmitters to do that.

You don't know what you're talking about (1)

Anonymous Coward | about 4 months ago | (#46874233)

All models are simplified. This simulator happens to incorporate ion channels, and other effects, and has been used to replicate many real world behaviors.

The interesting bits (5, Informative)

Anonymous Coward | about 4 months ago | (#46874213)

It isn't a typical ASIC; the chip is a custom, fully asynchronous, mixed digital+analog design. The board uses 16 chips in a tree router for guaranteed deadlock prevention between the chips, and can simulate 1 million neurons powered by only one USB port.

The neurons are implemented with analog circuits to match the dynamics of real neurons, moving beyond a simple Hodgkin-Huxley model to include components like ion channels, a first of its kind in an analog chip. It has a neat hexagonal resistor network that distributes the spike impulse across a neighborhood of neurons, a phenomenon seen in many cortical brain areas; essentially an analog phenomenon implemented efficiently in analog design.

Analog gives it fun biological-like properties, such as a temperature sensitivity that must be regulated with additional circuitry. Asynchronous design means that, outside of leakage from the chip (which is low with such a large fabrication process), very little energy is used at the neuron level if no stimulus is present. This is in contrast to a traditional CPU, whose clock forces much of the chip to consume energy every cycle.

Outside of wireless/signaling stuff, this is probably the biggest mixed analog digital asynchronous chip in existence.

But otherwise yes, the editors sucked on this one.
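For readers unfamiliar with the "simple Hodgkin-Huxley model" mentioned above, here is an editorial toy: a digital Euler integration of the classic HH membrane equations with standard textbook parameters (mV, ms, uA/cm²). This is what Neurogrid's analog circuits compute in silicon physics rather than in software, and per the parent, the real chip goes beyond it.

```python
import math

def hh_spike_count(i_ext=10.0, t_ms=50.0, dt=0.01):
    """Count spikes (upward crossings of 0 mV) under constant current drive."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3        # peak conductances (mS/cm^2)
    e_na, e_k, e_l = 50.0, -77.0, -54.387    # reversal potentials (mV)
    c_m = 1.0                                 # membrane capacitance (uF/cm^2)

    # Standard gate rate functions (1/ms) for the squid giant axon.
    def a_m(v): return 0.1 * (v + 40) / (1 - math.exp(-(v + 40) / 10))
    def b_m(v): return 4.0 * math.exp(-(v + 65) / 18)
    def a_h(v): return 0.07 * math.exp(-(v + 65) / 20)
    def b_h(v): return 1.0 / (1 + math.exp(-(v + 35) / 10))
    def a_n(v): return 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10))
    def b_n(v): return 0.125 * math.exp(-(v + 65) / 80)

    v = -65.0
    # Start each gating variable at its resting steady state.
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))

    spikes, above = 0, False
    for _ in range(int(t_ms / dt)):
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        if v > 0 and not above:
            spikes += 1
        above = v > 0
    return spikes

print(hh_spike_count())   # repetitive firing, not a single threshold event
```

Every simulated neuron pays for these exponentials at every time step on a CPU; an analog circuit gets the equivalent dynamics from device physics for free, which is where the efficiency claim comes from.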

Re:The interesting bits (0)

Anonymous Coward | about 4 months ago | (#46875569)

I like how people think "analog" and "digital" are two different types of circuits.

Re:Article and summary is misleading (0)

Anonymous Coward | about 4 months ago | (#46874701)

It's over 9000.

Re:Article and summary is misleading (1)

Jim Sadler (3430529) | about 4 months ago | (#46875275)

It will be a while before we can understand just how important such circuits can be. It may be that they simply supplement CPUs already in use. And much will depend upon just how deeply we can program such a device as well. It may well be that the worst path to take would be to try to get a machine to think like a human. We humans are a bit on the defective side. How well can we think when we have a history of electing people like George W. Bush as President? The evidence at hand is that humanity is sort of an evil, rambling, wreck heading for a serious dead end destiny.

Just what we need (1)

Noxal (816780) | about 4 months ago | (#46874011)

Another subgenre of EDM.

That's just.... (4, Funny)

funwithBSD (245349) | about 4 months ago | (#46874137)

Cray Cray.

9,000 times faster than a PC... (4, Insightful)

timeOday (582209) | about 4 months ago | (#46874199)

9000 times faster than a PC, if that PC happens to be running the specific artificial neural network simulation implemented in hardware by this chip.

Not that I'm knocking it. A GPU implements specific algorithms to great effect. But a GPU's algorithms are ones that are interesting for a specific application (drawing texture-mapped polygons), whereas an artificial neural network still needs another layer of programming to do something useful. In other words, a Word Processor implemented on this chip would not be 9000x faster than a Word Processor implemented on a CPU. A face recognition algorithm, on the other hand, might see a decent fraction of that 9000x, although it remains to be seen whether this chip would be a better fit for any particular application than a GPU (for example).
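The parent's point is essentially Amdahl's law: only the fraction of an application that is neural-network evaluation gets the 9,000x. An editorial sketch of the bound (the fractions are illustrative, not measured):

```python
def overall_speedup(fraction_accelerated, factor=9_000):
    """Amdahl's-law bound: only the accelerated fraction gets the factor."""
    return 1.0 / ((1.0 - fraction_accelerated)
                  + fraction_accelerated / factor)

# A word processor (~0% neural work) sees nothing; a hypothetical face
# recognizer that is 99% neural-net evaluation sees roughly 99x, not 9,000x.
for f in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(f, round(overall_speedup(f), 1))
```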

Wow, just imagine.... (1)

pslytely psycho (1699190) | about 4 months ago | (#46874345)

GTA IV and Kerbal Space Program with no lag!

Predicted in 2003 (0)

Anonymous Coward | about 4 months ago | (#46874449)

in this story that was self-published to Kindle in 2010:

http://www.amazon.com/America-The-Enslaved-Neurochip-ebook/dp/B007LAX6YY

It's NOT like a digital computer! (1)

crioca (1394491) | about 4 months ago | (#46874451)

It’s very, very different; neuromorphic chips have been around for ages. They use the same phenomenon the brain does (ion flow across a neuron's membrane) via a different method (electron flow across a silicon membrane).

The big difference is that they make use of analogue computation using the physical properties of electricity to model whatever you’re trying to model, whereas digital computers model things by representing quantities as symbolic values.

So a digital computer lets you model something by simulating it with symbolic values; an analogue computer lets you model something by emulating a system's physical properties.

There is no machine code to speak of; you can't program an analogue circuit, you have to physically construct it. That's what makes this Neurogrid technology interesting: if these guys are on the level, then they've developed a practical way to use digital computers to "program" analogue circuits.
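The analog/digital contrast in this comment can be shown with a toy example. An analog circuit charging a capacitor integrates dV/dt = (V_in - V)/RC "for free," just by obeying physics; a digital computer has to step the same equation numerically with symbolic values. A minimal forward-Euler sketch (all component values illustrative):

```python
def simulate_rc(v0, v_in, r, c, dt, steps):
    """Digitally simulate the RC charging curve an analog circuit
    produces for free: dV/dt = (v_in - v) / (R*C), forward Euler.

    Returns the list of voltages, one per timestep (steps+1 values).
    """
    v = v0
    trace = [v]
    for _ in range(steps):
        v += dt * (v_in - v) / (r * c)
        trace.append(v)
    return trace

# Charging a 1 kilohm / 1 microfarad node toward 1 V for ~5 time
# constants; the trace approaches but never reaches 1 V.
trace = simulate_rc(v0=0.0, v_in=1.0, r=1e3, c=1e-6, dt=1e-5, steps=500)
```

The digital version pays per timestep and per state variable; the analog version pays once, in silicon area, which is one way to read the power-efficiency claims in the article.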

What problem? (1)

manu0601 (2221348) | about 4 months ago | (#46874647)

That looks nice, but what problem does it solve?

Similar to Connection Machine (Thinking Machines) (0)

Anonymous Coward | about 4 months ago | (#46874669)

Isn't this somewhat (at least loosely) similar to the approach that Thinking Machines took with the Connection Machine CM-5? I realize it's very different, but the rationale is the same. Granted, they're putting on a single die what the CM-5 was in totality.

over 9000 (1)

ultranerdz (1718606) | about 4 months ago | (#46874687)

they could have said it was over 9000 times faster than a PC

More vaporware. (0)

Anonymous Coward | about 4 months ago | (#46874735)

I have read about so many great things here over the years Slashdot has existed, but a link where you can actually get one never, ever appears.
By the time this is ready, there will certainly be a new chip to read about.

Wow! 25 Year Old Performance! (1)

Baldrson (78598) | about 4 months ago | (#46874789)

In 1989 I was doing billions of connections per second for trainable multisource image segmentation: DataCube finite impulse response filter hardware did the weighted sums, and a hardware lookup table did the sigmoid mapping, all for around $40,000 in off-the-shelf VME bus hardware. That was in 1989 dollars, though, so I guess there has been some advancement.
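The two tricks this poster describes, hardware weighted sums plus a sigmoid lookup table, are easy to sketch in software. The table size and input range below are illustrative, not the DataCube's actual parameters:

```python
import math

# Precompute a sigmoid lookup table, as 1980s neural hardware did to
# avoid evaluating exp() per connection. Size/range are illustrative.
TABLE_SIZE = 256
LO, HI = -8.0, 8.0
SIGMOID_LUT = [1.0 / (1.0 + math.exp(-(LO + (HI - LO) * i / (TABLE_SIZE - 1))))
               for i in range(TABLE_SIZE)]

def sigmoid_lut(x):
    """Clamp x into [LO, HI] and read the nearest table entry."""
    x = max(LO, min(HI, x))
    i = round((x - LO) / (HI - LO) * (TABLE_SIZE - 1))
    return SIGMOID_LUT[i]

def neuron_output(inputs, weights):
    # The FIR hardware computed this inner product; the lookup
    # table supplied the nonlinearity.
    return sigmoid_lut(sum(w * x for w, x in zip(weights, inputs)))
```

A 256-entry table quantizes the sigmoid coarsely, but for a trained network the output error is small compared to the cost of computing exp() per synapse in 1989-era hardware.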

Unfortunately theoretical, not practical/actual (0)

Anonymous Coward | about 4 months ago | (#46874831)

1 million neurons. ~1000 inputs each.

A lot of hype, because those large-sounding raw (limited) processing-operation counts (and not floating-point math of any accuracy) are Apples to the normal PC's (a general-purpose processor) Oranges.

The reader might do a small sampling of neural-net programming difficulties: it's been shown that there are limited problem sets that all neural nets are applicable to (and this IS only one limited flavor), and then there's the problem of forming the thing's "program" (the really difficult part), where the likelihood of failure/thrashing grows exponentially with the size of the net (the complexity of the problem it is meant to solve).

Training any neural net is its chokepoint, and EVERY problem domain has to be tediously trained again (a lot of human intercession) - and that's for problem sets that you CAN provide a lot of training data for (and not large sets of unknowns).
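The per-domain training chokepoint this comment describes can be seen even in the smallest possible case: a single perceptron, which must be trained from labeled examples for each new problem, and which famously cannot learn some problems at all. A toy sketch (nothing Neurogrid-specific):

```python
import random

def train_perceptron(data, lr=0.1, epochs=100, seed=0):
    """Train one perceptron on labeled (inputs, target) pairs.

    Illustrates the point above: every problem domain needs its own
    labeled training set and its own training run.
    """
    rng = random.Random(seed)
    n = len(data[0][0])
    w = [rng.uniform(-0.5, 0.5) for _ in range(n)]
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Learning AND is easy; XOR, famously, can never converge on a single
# perceptron -- a small taste of the "limited problem sets" complaint.
AND_DATA = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(AND_DATA)
```

Scaling this loop from four labeled examples to real-world problem domains is exactly where the "tedious retraining" cost lives, regardless of how fast the underlying neurons run.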

Re:Unfortunately theoretical, not practical/actual (1)

mevets (322601) | about 4 months ago | (#46875013)

... Apples to the normal PC's....

Do you think it could run OS X ?

Can it mine?!?!?! (1)

Rick in China (2934527) | about 4 months ago | (#46875037)

How fast can it get me bitcoins? OMGZ I need to buy one now. Wait. After reading the article, it's pretty much ambiguous nonsense.

Transcendence still out in the future (0)

Anonymous Coward | about 4 months ago | (#46875187)

The human brain is about 100 billion neurons: 100,000 of the projected $400 million-neuron boards, or $40 million for human equivalence.

And we still don't know how to download Johnny Depp.

Obligatory (1)

mmell (832646) | about 4 months ago | (#46875271)

I'd love to see a beowulf cluster of those!

Let's say they could replicate the complexity... (1)

mmell (832646) | about 4 months ago | (#46875371)

of a brain. How do they plan to get and process that much I/O? Oh, and what storage scheme are they planning on?

This might be yet another step on the way to a truly sentient artificial intelligence - but even if these things live up to their promise, we still have a long way to go before we artificially create consciousness. Incidentally, how will we know that we've succeeded? The old "ability to ask the question is the answer" rule doesn't apply, as the device could easily be programmed to ask - or might just randomly ask, but not really care about the answer.
