Slashdot: News for Nerds


Researchers Create First All Optical Nanowire NAND Gate

Soulskill posted about 2 years ago | from the anything-electrons-can-do-photons-can-do-better dept.

Hardware 50

mhore writes "Researchers at the University of Pennsylvania have created the first all optical, nanowire-based NAND gate, which paves the way towards photonic devices that manipulate light to perform computations. From the release: 'The research team began by precisely cutting a gap into a nanowire. They then pumped enough energy into the first nanowire segment that it began to emit laser light from its end and through the gap. Because the researchers started with a single nanowire, the two segment ends were perfectly matched, allowing the second segment to efficiently absorb and transmit the light down its length.' The gate works by shining light on the nanowire structure to turn on and off information transported through the wire. The research appeared this month in Nature Nanotechnology (abstract)."


50 comments

Wrong direction (3, Insightful)

nurb432 (527695) | about 2 years ago | (#41275017)

I think we are wasting the potential of future optics if we think in binary, as this team is doing.

Optics scream for multilevel logic.

Re:Wrong direction (-1)

Teresita (982888) | about 2 years ago | (#41275093)

Either way, I think Moore's Law needs to be revised again. He was a pessimist.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41275369)

Moore's law was about the doubling of transistor count, not increases in performance. Photons may make better use of the computational resources of matter (atoms) than electrons do, but the transistor count will stay on target. Just as Moore predicted.

Re:Wrong direction (1)

peragrin (659227) | about 2 years ago | (#41275133)

If we can do binary, then we don't need to completely rethink programming languages while also testing out the new hardware.

Re:Wrong direction (1, Interesting)

Teresita (982888) | about 2 years ago | (#41275177)

Betcha didn't know all the computations for the holodeck were done inside the holograms themselves.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41280503)

Betcha didn't know the holodeck is made-up technology.

Re:Wrong direction (1)

FatdogHaiku (978357) | about 2 years ago | (#41282839)

Betcha didn't know the holodeck is made-up technology.

Sure does speed up debugging!

Yes, we need to revisit everything. (1)

Anonymous Coward | about 2 years ago | (#41275215)

If we can do binary then we don't need to completely rethink the programming languages as well as test out the hardware.

But that's what we need to do. Current software technology won't measure up to the new hardware technology. As it is, current development languages don't even use multiprocessor CPUs efficiently, and these same tools are going to be horribly inadequate.

And we will see that most of what we know about computer science becomes obsolete with this new computational machine.

But that's the way it goes. People have to frame new technology in old paradigms because humans aren't that adaptable. But one day, someone will come around and see what these new machines are truly capable of and re-write computer science.

Re:Yes, we need to revisit everything. (1)

Anonymous Coward | about 2 years ago | (#41275289)

Computer science is not going to be made obsolete by optical logic.

Re:Yes, we need to revisit everything. (2)

nurb432 (527695) | about 2 years ago | (#41275299)

That isn't exactly what he meant. Current computer science will be mostly obsolete, but not the concept of computer science. It will adapt, it has to.

Re:Yes, we need to revisit everything. (0)

Anonymous Coward | about 2 years ago | (#41275425)

Still disagree that any of it will be made obsolete by optical logic. CompSci isn't married to electronics.
Now, in the unpredictable future where anything can happen, maybe we'll all be using quantum computers, and then I would agree there would be a huge shakeup in computational theory.

Binary will go away. (1)

Anonymous Coward | about 2 years ago | (#41275505)

CompSci isn't married to electronics.

Binary math maps perfectly onto switches: relays, then vacuum tubes, and then transistors.

Binary will fail miserably in a quantum computational environment.

Operating system theory will be thrown out the door. So will networking.

Data structures will definitely have to be reworked...

CS as we know it is a goner, which is wonderful! I wish I could be around for it all. Alas, I'll probably be long gone by the time this technology makes it to the stage where it will be useful to CS people.

Re:Binary will go away. (0)

Anonymous Coward | about 2 years ago | (#41277417)

Stop voicing your opinions on subjects you know nothing about.

Binary will fail miserably in a quantum computational environment.

First, binary works perfectly in a quantum computational environment. Depending on where you're storing your qubits, they either have up spin or down spin, left spin or right spin, or, since you're talking about optical, horizontal polarization or vertical polarization, etc. Yes, qubits can also be in a superposition state, but that's not a new state. It's not a 2. It's still a 0 and a 1. At the end of the computation, it will not be in superposition. In fact, by reading the result you ensure that it's not in superposition.

Second, quantum computing will not replace classical computing. Quantum computing sucks at a lot of things classical computers are good at. In fact, Shor's algorithm doesn't work unless you have a classical computer to verify the answer it gives you, so you can figure out when the answer is correct (and yes, I know a person can multiply without a computer, but a person doing arithmetic by hand is a classical computer).

Operating System theory will be thrown out the door. So will networking.

Datastructures will definitely have to be reworked ...

CS as we know it is a goner

Blah, blah, blah. CS as we know it has been developing algorithms for technology that doesn't exist yet for a long, long time. We're ahead of the curve. By the time the technology is there, we already have the code to run on it.
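The measurement point above can be illustrated with a toy sketch (plain Python, not a real quantum library; the helper name is made up): a qubit prepared in an equal superposition still measures to a classical 0 or 1, never some third value.

```python
import random

def measure_plus_state():
    """Toy model of measuring |+> = (|0> + |1>)/sqrt(2).
    The outcome collapses to 0 or 1, each with probability
    |amplitude|^2 = 0.5. There is no third outcome."""
    return 0 if random.random() < 0.5 else 1

# Every measurement yields a classical bit.
outcomes = {measure_plus_state() for _ in range(1000)}
assert outcomes <= {0, 1}
```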

Re:Binary will go away. (0)

Anonymous Coward | about 2 years ago | (#41278923)

Quantum anything is highly impractical for most things. How can you build a computer if you can't even copy data from one place to another? A quantum computer is really a classical computer with a quantum co-processor.

Re:Yes, we need to revisit everything. (0)

Anonymous Coward | about 2 years ago | (#41275327)

one program PER CORE!!! woohoo

Re:Yes, we need to revisit everything. (2)

mo (2873) | about 2 years ago | (#41275869)

It's not that humans are not adaptable, it's that parallel computing is hard for humans to figure out. Linear execution lends itself to all kinds of easy abstractions: loops, branches, methods, etc. Parallel computing, not so much. Mutexes are awful. The best we've got is message passing and functional programming, but even that is hard to design correctly to be both understandable and exploit inherent parallelism.

Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was invented by Touring before we even had built a computer. It's easy to visualize how it works. My brain explodes though trying to imagine a fuzzy-logic analog equivalent of a touring machine.

I used to think that AI research combined with neuroscience would figure out a simple solution to this problem, but it's increasingly seeming like, no, it's even complicated in the brain.

So people can pine for analog memristor computation, and analog optical computing all they want, but the hardware is the easy part here. Get the software side solved, and if you build it they will come. But it's not because we aren't used to these problems, it's because these problems are really really hard.
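The message-passing style mentioned above can be sketched briefly (a minimal Python illustration, not from the article): workers pull tasks from one queue and push results onto another, so no mutex ever guards application data.

```python
# Message passing instead of shared mutable state: each worker pulls
# messages from a task queue and sends results back on a result queue.
import queue
import threading

tasks = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        n = tasks.get()
        if n is None:          # sentinel: shut down this worker
            break
        results.put(n * n)     # pure computation, no shared state

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)            # one sentinel per worker
for t in threads:
    t.join()

# Results arrive in nondeterministic order, so sort before checking.
squares = sorted(results.get() for _ in range(10))
assert squares == [n * n for n in range(10)]
```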

Re:Yes, we need to revisit everything. (0)

Anonymous Coward | about 2 years ago | (#41277083)

Parallel programming isn't that bad. Basic OO designs make it quite easy to program and debug. I just use objects to store state, and I make sure each object transitions through very well defined states.

So within a given class, all instance and static methods handle everything that needs to be done for that class. Each class does a fairly specific set of work, and may interact with other classes, but only through those classes' instance or static methods.

I really don't think multi-threading is much harder than serial programming.

Coroutines + async I/O + multithreading = very scalable (assuming few shared resources, which is usually a matter of design).

Sometimes you also realize that approximate is "good enough" and you don't need perfect. You can remove a lot of locking if you realize that some race conditions really aren't that bad.
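The pattern described above can be sketched in a few lines (a hypothetical Python example, not from the thread): all state transitions are confined to the object's own methods behind a single lock, so the instance moves through well defined states no matter which thread calls in.

```python
# State confinement: the only code that mutates _value is inside
# the class's own methods, each guarded by the instance's lock.
import threading

class Counter:
    def __init__(self):
        self._lock = threading.Lock()
        self._value = 0

    def increment(self):
        with self._lock:       # the only place _value is ever mutated
            self._value += 1

    def value(self):
        with self._lock:
            return self._value

c = Counter()
threads = [threading.Thread(target=lambda: [c.increment() for _ in range(1000)])
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert c.value() == 8000       # no lost updates across 8 threads
```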

Re:Yes, we need to revisit everything. (0)

ByteSlicer (735276) | about 2 years ago | (#41280559)

Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was invented by Touring before we even had built a computer. It's easy to visualize how it works. My brain explodes though trying to imagine a fuzzy-logic analog equivalent of a touring machine.

His name was Alan Turing. Honor him by at least spelling his name correctly...

Re:Yes, we need to revisit everything. (-1)

Anonymous Coward | about 2 years ago | (#41283153)

Why do we need to honour anyone? Your naive hero worship is unproductive and childish. Besides, it's honour, you stupid ass.

Get your own shit together and stop telling people what to do. I don't come to where you work and criticize how you suck dicks, do I?

Re:Yes, we need to revisit everything. (1)

ByteSlicer (735276) | about 2 years ago | (#41287147)

It's not hero worship, it's a matter of respect. Turing made great contributions to several fields related to computers and math. As a reward, he was prosecuted by his own government. So the least we can do is remember his name correctly.

And "honor" is the US spelling of the UK "honour". Your reaction is unproductive, childish and cowardly.

Re:Wrong direction (1)

nurb432 (527695) | about 2 years ago | (#41275293)

Never tried to imply it would be painless or trivial, but sometimes the end results are worth the effort to truly advance to a new level.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41275387)

Differing hardware implementations are why compilers exist.

Re:Wrong direction (1)

Anonymous Coward | about 2 years ago | (#41275147)

I think we are wasting the potential of future optics if we think in binary, as this team is doing.

I agree with this statement about 22% (orange).

Re:Wrong direction (1)

Anonymous Coward | about 2 years ago | (#41275189)

You can do that with electric current, too. There's a reason analog computers died out.

Re:Wrong direction (1)

nurb432 (527695) | about 2 years ago | (#41275223)

Yes there was, but it was due to technology/cost not keeping up with the far simpler digital world, not due to an inherent problem with the concept.

I'm also not talking analog, but multilevel logic. There is a difference. You can still have your 'digital' accuracy, but increase your bandwidth several times over.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41275497)

This is already done when sending signals across fiber optic cable - many different wavelengths are being used at one time, each carrying a 0/1 signal.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41275247)

I disagree. Optical methods tuned to a single wavelength will likely ease gate array construction (by an order of magnitude I would guess), simplify the interface layer to the electronic side, and build on decades of experience with binary logic.

I'm glad they got this far at all.

Re:Wrong direction (1)

Noitatsidem (1701520) | about 2 years ago | (#41275283)

Computers used to be analog, but they screamed for digital logic. We live in a digital era for a reason.

Re:Wrong direction (2)

Kergan (780543) | about 2 years ago | (#41275307)

Not sure what you mean by multilevel logic, but I'd suggest it screams for multiplexed logic. (By this, I mean using the same gates several times at once by multiplexing, who knows, different wavelengths, polarizations or angular moments.)

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41275371)

Not sure what you mean by multilevel logic, but I'd suggest it screams for multiplexed logic. (By this, I mean using the same gates several times at once by multiplexing, who knows, different wavelengths, polarizations or angular moments.)

Exactly. DWDM follows this logic, and it exists today.

Re:Wrong direction (1)

Anonymous Coward | about 2 years ago | (#41275365)

> Optics scream for multilevel logic.

Dunno where you are getting that from. Transistors are analogue as well (as used in amplifiers); we just chose to use them binarily (not-a-word) because it's easier to deal with. The Russians tried some stuff with ternary systems (more efficient because base 3 is closer to base e), but they were abandoned.

Re:Wrong direction (3, Funny)

jovius (974690) | about 2 years ago | (#41275383)

There already are 10 levels.

Re:Wrong direction (5, Insightful)

perl6geek (1867146) | about 2 years ago | (#41275465)

I've worked for two years on a PhD thesis involving all-optical signal processing (though I worked on all-optical signal regeneration, not logic gates), and one of my conclusions is that multi-level is an order of magnitude more challenging than two values. The reason is that if you do multiple processing steps, you usually get some random fluctuations, so you need components that fix the signal to a defined level.

Now you have basically two options: you can encode your information in the phase, or in the amplitude/power. For power levels you can use something like nonlinear loop mirrors, but they have the problem that they change the power ratio between the states. For phase-encoded signals you can use a saturated phase-sensitive amplifier (for example with two symmetric pumps), but these require quite high powers, you have to injection-lock the pumps to compensate for phase drifts, and they still only work for two levels.

There is exactly one scheme that works for multiple levels (see http://eprints.soton.ac.uk/336325/1.hasCoversheetVersion/Thesis.pdf [soton.ac.uk] for a PhD thesis about it), but it turns phase noise into amplitude noise, so you need an amplitude regenerator after it. So binary logic is plenty of challenge to get working; once that's established, we can still think about multiple levels.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41276125)

Posts like this are the reason I still come to slashdot.

Thank you.

Re:Wrong direction (0)

Anonymous Coward | about 2 years ago | (#41277849)

Posts like this are the reason I still come to slashdot.

Thank you.

Hear, hear.

Re:Wrong direction (1)

Anonymous Coward | about 2 years ago | (#41277089)

Now, is that a binary or decimal order of magnitude? :-)

An amazing development (0)

Anonymous Coward | about 2 years ago | (#41275211)

Now we're officially in the future.

NAND? Sounds like an AND gate to me... (0)

Anonymous Coward | about 2 years ago | (#41275229)

When both inputs 'on' mean the output is 'on', it sounds like an AND gate to me...

Re:NAND? Sounds like an AND gate to me... (4, Informative)

Kergan (780543) | about 2 years ago | (#41275255)

Not sure where you read this... Per TFA:

[quote]
A NAND gate, which stands for “not and,” returns a “0” output when all its inputs are “1.”
[/quote]

And the Nature Nanotechnology article's summary says nothing specific.

Re:NAND? Sounds like an AND gate to me... (0)

Anonymous Coward | about 2 years ago | (#41279379)

Also an OR gate.

Re:NAND? Sounds like an AND gate to me... (1)

dissy (172727) | about 2 years ago | (#41279387)

NAND is read as Not-AND.

For an AND gate, the output is only on if both inputs are on; the output is off in all other conditions (including both inputs being off).
A NAND gate is the reverse of that: its output is only off when both inputs are on, and on in all other cases.

The most important detail about the NAND gate is that you can build all the other gates from nothing but NAND gates.
NAND and NOR are the only two logic gates that share this ability.

This article shows all the basic logic gates and how to construct them from nothing but NAND gates:
http://en.wikipedia.org/wiki/NAND_logic [wikipedia.org]

If they had only constructed an AND gate, they would still need to construct a NOT gate before being able to build any other gate, so it would not be as important a milestone.
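The universality claim above can be checked mechanically. A short sketch (plain Python, helper names invented here) builds NOT, AND, and OR from NAND alone and verifies them against their truth tables:

```python
# Build the basic gates from NAND alone and verify them exhaustively.

def nand(a, b):
    """NAND: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)              # NAND with both inputs tied together

def and_(a, b):
    return not_(nand(a, b))        # AND is just an inverted NAND

def or_(a, b):
    return nand(not_(a), not_(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)

for a in (0, 1):
    for b in (0, 1):
        assert nand(a, b) == 1 - (a & b)
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```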

NAND is enough (0)

Anonymous Coward | about 2 years ago | (#41275311)

As I recall from a previous, high-tech life, it is possible to build a computer using only NAND gates. In other words, without AND, OR, and NOR gates.
I'm looking forward to an all-optical computer.

Re:NAND is enough (2)

unixisc (2429386) | about 2 years ago | (#41278067)

As per De Morgan's laws, it's possible to build a computer using only NAND or only NOR gates. But not with AND or OR gates, if inverters don't exist.
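The identities being invoked here are easy to verify exhaustively (a quick Python check, added for illustration):

```python
# De Morgan's laws, checked over all four input combinations:
#   NOT(a AND b) == (NOT a) OR  (NOT b)
#   NOT(a OR  b) == (NOT a) AND (NOT b)
# These let NAND (or NOR) plus its self-inversion reach every gate;
# AND/OR alone can never produce a NOT, so they are not universal.
for a in (False, True):
    for b in (False, True):
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))
```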

Been there (2)

frovingslosh (582462) | about 2 years ago | (#41275423)

I remember reading, over twenty years ago, about an all-optical NAND gate. This was pre-web, so it might not be easy to find, but I remember the article. The gate was much, much larger, although the developers (of course) said they expected to be able to shrink it to suitable dimensions. And a good part of the article was a prediction of an all-optical computer within 5 years. The logic behind this prediction was that every step in creating the state of technology we had then was very incremental: we used hand labor to create the first ICs; we used those to create powerful computers; we created CAD software for those to develop even smaller computers. Since all of those steps had been done and were in place, the prediction was that it should take no more than 5 years to substitute optical technology for electronic technology in computer design software and start cranking out optical devices.

Not sure what happened to that optical NAND gate from over two decades ago. Maybe it just couldn't shrink down. Maybe it was just falsified. Or maybe someone already has optical computers but will not share them with us conspiracy theorists. But I'm jaded now and not so inclined to get excited about yet another "first" announcement.

Re:Been there (0)

Anonymous Coward | about 2 years ago | (#41275529)

It's done with carbon nanotubes. That's the difference.

Density and No True Off (2)

simpz (978228) | about 2 years ago | (#41276459)

The two major disadvantages of all-optical processing in the past were:

1/ The wavelength of light is much larger than the structures used in modern-day chips, so optical circuitry wouldn't be as dense as modern-day electronics.

2/ When you turn off an optical signal it gets turned into heat (i.e. the transistor goes black); that's not true in electronics, where there is no electrical flow when the transistor is off. This would cause optical devices to run hotter than electronic ones, hurting density yet again.

Optics and optical processing have their place (especially processing communications data), but for high-density processing these two issues are very problematic.

Re:Density and No True Off (3, Insightful)

mark-t (151149) | about 2 years ago | (#41276869)

Photons have an interesting property over electrons, however. A photon's motion does not produce an external field that affects the trajectory of others passing close to it. An electron's does.

Re:Density and No True Off (0)

Anonymous Coward | about 2 years ago | (#41283629)

Have you heard of Electromagnetic Fields (EMF)?

Re:Density and No True Off (1)

mark-t (151149) | about 2 years ago | (#41284299)

Yup. And photons don't change their direction when they are exposed to one. Electrons do.

Nyan gate (1)

Anonymous Coward | about 2 years ago | (#41279053)

Moore's law will be saved by rainbows and kittens.
