
Comments


Artificial Brain '10 Years Away'

BillyBlaze Re:Seems ethically dodgy... (539 comments)

Science does ignore things outside of the universe, but amazingly enough, everything that matters is, by definition, inside it.

In other words, suppose there is a soul. If we can still make a brain simulator that acts conscious, then the soul doesn't really matter, because it has no observable effect. If, because humans have souls and computers don't, we can't make a conscious brain simulator, then the soul does have an observable effect and can be reasoned about with science. Now, in the first case, you might say that the brain simulator acts conscious but isn't. That would be a lot like saying people with a different skin color act conscious but aren't - not a morally defensible position.

Religions are not dualist because their ability to reason without evidence has allowed them to see some great truth that science has missed. They're dualist because they were conceived before we came to the great realization that the behavior of living things emerges from the physical laws.

more than 5 years ago

Artificial Brain '10 Years Away'

BillyBlaze Re:Seems ethically dodgy... (539 comments)

Why would you be unable to anesthetize an artificial brain? An anesthetic is just a chemical that has some (currently not well understood) effect on the physical processes in your brain. If the artificial brain works by simulating those processes, it should be relatively straightforward to simulate those effects, and you should get the same temporary loss of consciousness.

I would say that consciousness is inherently tied to the algorithms that produce it. Those algorithms happen to be executed by a massively parallel, self-modifying, chaotic biological organ, but, being algorithms, they could in principle be carried out by other hardware (the strong Church-Turing thesis). Granted, our crude attempts to design similar algorithms from first principles (Bayesian networks, predicate logic, expert systems, etc.) are so different from what happens in the brain that it's fair to say they are not the same thing. But that's not what these guys are doing - they're not reverse-engineering the software, they're emulating it at a low level.
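To make "emulating it at a low level" concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python - a toy stand-in for the kind of biophysical model a brain-emulation project might simulate, not the actual model any such project uses, and all constants are illustrative:

    def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                     v_reset=-65.0, v_threshold=-50.0, resistance=1.0):
        """Return the membrane-voltage trace and spike times for an input-current trace."""
        v = v_rest
        voltages, spikes = [], []
        for step, i_in in enumerate(input_current):
            # Euler step of dv/dt = (-(v - v_rest) + R * I) / tau:
            # the "physics" of the membrane, not any high-level reasoning rule.
            v += dt * (-(v - v_rest) + resistance * i_in) / tau
            if v >= v_threshold:         # threshold crossing -> spike
                spikes.append(step * dt)
                v = v_reset              # reset after the spike
            voltages.append(v)
        return voltages, spikes

    if __name__ == "__main__":
        # Constant input current of 20 (arbitrary units) for 100 ms of simulated time.
        trace, spike_times = simulate_lif([20.0] * 1000)
        print(len(spike_times), "spikes in 100 ms of simulated time")

Nothing in that loop encodes reasoning or perception; whatever intelligence appears would have to emerge from wiring enormous numbers of such units together, just as it does with real neurons.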

I suspect the only real barriers are technical - how do you get sufficient information about the structure of the brain and how it changes over time? How do you learn which aspects are important and which can be abstracted away? And how do you get it running fast enough?

more than 5 years ago

Elcomsoft Claims WPA/WPA2 Cracking Breakthrough

BillyBlaze Re:You can get hard passwords (349 comments)

Randomly banging on the keyboard clearly produces less-than-ideal entropy. Case in point: your password contains "asedf", which I'm willing to bet came from drumming the fingers of your left hand. Whether that matters for such a long password is another question, but if you're paranoid enough to use a password like that, you may as well go the extra mile.
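A rough way to see the gap: compare the entropy per character of a uniformly random pick from the printable ASCII set against an empirical estimate for a home-row-heavy mash. This is only an illustrative sketch - the sample string is invented, and a per-character estimate ignores correlations between keystrokes, so it actually overstates the mash's real entropy:

    import math
    from collections import Counter

    def entropy_per_char(sample):
        """Empirical Shannon entropy (bits per character) of a sample string."""
        counts = Counter(sample)
        total = len(sample)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    # A character chosen uniformly from the 94 printable ASCII characters:
    print("uniform random: %.2f bits/char" % math.log2(94))

    # Invented 'keyboard mash' sample, home-row heavy the way drumming tends to be:
    mash = "asedfasdfjkl;asdfaseddfjklasdfasedf"
    print("mash estimate:  %.2f bits/char" % entropy_per_char(mash))

Even with that generous estimate, the mash comes in well under the roughly 6.5 bits per character you get from truly random printable characters.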

more than 6 years ago

Submissions

BillyBlaze hasn't submitted any stories.

Journals

BillyBlaze has no journal entries.
