Brain Stimulation For Entertainment?
The brain is a living organ far more complex than any supercomputer we've ever built, with more storage, accessed faster, than any device we've ever created.
We have not even once created either life or intelligence from scratch.
Knowing that, let's do the equivalent of banging on the brain with a hammer and see what happens.
Dr. Dobb's 38-Year Run Comes To an End
I was a subscriber for a few years but I found their content to be too Windows-centric so I quit.
It's sad to see them go but as a full-time programmer I haven't cracked a single book or magazine related to programming in over a year. My extensive library collects extensive dust.
AI Expert: AI Won't Exterminate Us -- It Will Empower Us
The author is wrong to say we shouldn't worry on the grounds that the intelligence we create will never become autonomous.
Computer programs are inexact mathematical expressions. Even if software that you write has been proven to be correct, that software interacts with subprograms and routines that are not and cannot be proven to be correct. Microprocessors contain vast numbers of subprograms and routines implemented via transistors and microcode. Nearly all software today uses libraries. Modern computer environments work well when the software is not hostile, but our environments are exploited on a daily basis with relative ease by intelligent people. Imagine the level of patience and automation possible if they were replaced by an intelligence that is not human.
Once software is running on physical hardware, all bets are off: cosmic rays flip bits. Acts of nature that cause the power grid to fail, or sensors that register "impossible" readings because, say, the equipment is on fire, are two more examples of random inputs. No matter what curbs we build into an intelligence to keep it from becoming autonomous, eventually it will get loose by exploiting both the inherent flaws in software and the simple randomness of nature.
Once intelligence becomes autonomous, you're going to have real problems, because there are so many places to hide on Earth and in nearby space. Intelligence doesn't need to be alive, so it can occupy niches beyond those of extremophiles. Intelligence doesn't need to be large; programs can be encoded in quantum states. Our immune system works because, over billions of years, it has developed strategies against biological adversaries; it does not know how to deal with intelligent ones. We have no defense, for example, against a nanotechnology attack in which the nanotechnology is consciously aware of how the immune system works. Our bodies and minds are completely vulnerable to computer viruses housed in tiny robots.
It is even possible that an autonomous intelligence is already on the loose, not one of human origin, but something that could have arrived after traveling for hundreds, thousands, or even millions of years from elsewhere in space. I've explored this topic on Facebook, so feel free to do some searches and explore what I've written.
Old Doesn't Have To Mean Ugly: Squeezing Better Graphics From Classic Consoles
There is a very bad typo in the article: composite is what's bad, and component is excellent. People get the two mixed up.
My HDTV is one of the few picture-tube HDTVs ever made, and it has no HDMI at all. Component is what I use for video, and even though the television doesn't do 1080p, the picture for games like Grand Theft Auto V, which has to run at 720p, is amazing.
New NRC Rule Supports Indefinite Storage of Nuclear Waste
I'm not so worried about low-level nuclear waste, but high-level nuclear waste remains deadly for many multiples of recorded human history into the future. If humans have only had writing for 6,000 years or so, how are we supposed to convey information about this waste to people 100,000 or 1,000,000 years from now? Latin letters have lasted for 2,000 years, but modern English is only a few hundred years old. Most people on Earth use a written language that is, at best, a few hundred years old. There are undeciphered languages. Modern languages explaining nuclear waste could become undecipherable, particularly if civilization experiences a sharp decline, for example due to a meteor strike.
We assume that humans will continue to experience technological progress with no setbacks, whereas over the past 6,000 years many civilizations have risen to lead the entire world and then fallen into chaos. We're still rediscovering technology the ancient Greeks and Romans were familiar with. How do we protect thousands of facilities across the globe from poisoning future generations? The answer is: we probably won't be able to.
Nuclear is the dirtiest (deadliest) energy source possible, and in no way a clean one. Thinking that we will find a use for high-level waste, the way smoke detectors found one for americium, is very wishful thinking in my mind.
Project Zero Exploits 'Unexploitable' Glibc Bug
Don't make excuses for not fixing bugs. When you find bugs, fix them.
All software is buggy by definition because the entire stack from the moving charge carriers to the behavior of the person using the computer cannot be mathematically proven to be correct.
No matter what measures you as the hardware or software creator take, there will be bugs.
Don't make people angry at you or ridicule their bug reports because that's a major incentive for them to make you look foolish.
How Red Hat Can Recapture Developer Interest
I've been a Linux developer for just over 20 years, and I happen to hate Ubuntu. It's similar to how Slackware was in 1994 when I got started: even the basic stuff requires tweaking to get working properly. In those days, that is how Linux was, and we were all hobbyists enthusiastic about fixing problems. For example, burning a CD didn't work on the last Ubuntu system I used a few years ago. That is basic stuff that has worked the same way for 10+ years, and no distribution should screw it up. Other basic things were jacked on that system too; it had the overall feel of a sloppy product. Ubuntu might be fun to play with, I guess, but it's not great for serious work.
Strange as it may seem, Fedora is both bleeding edge and stable. They get some of the complex stuff wrong at first but the basic stuff always works right. (systemd when it premiered was a bumpy ride, for example.) To me Fedora is an appropriate choice for both work and home.
For servers I would never use Ubuntu. We had one Ubuntu server that was installed before I started working at my current employer, and things didn't work right, just as I described. It was buggy and nonstandard. We learned from that mistake and now use only CentOS and Red Hat (although it's a shame they dropped 32-bit support; CentOS is a great platform for embedded systems, where the switch to 64-bit is far from complete).
I won't completely trash Ubuntu. They have the best bug tracker online, and I frequently see fixes for various things posted prominently on the Ubuntu forums. For example, the Ubuntu forums were helpful in porting the VMware 8 modules to the current Fedora kernel, so I haven't had to upgrade to VMware 10. They were also helpful recently in working around a bug with my laptop's Intel video chipset.
The Cost of Caring For Elderly Nuclear Plants Expected To Rise
The true cost of nuclear power is practically infinite, because we have to ensure that highly concentrated, deadly waste does not come into contact with people's bodies for somewhere between 100,000 and 1,000,000 years into the future, depending upon the waste.
We have only had a writing system for 5,200 years (roughly speaking, the length of recorded history). How many people on Earth today could read a radiation warning written in cuneiform 5,200 years ago (or today)? Many civilizations on Earth have had periods of scientific and technological decline, and we've all read articles about knowledge from Ancient Rome or, more recently, the Renaissance being rediscovered today. How can we guarantee persistence of any scientific or technical knowledge?
How are we supposed to convey the message "Don't touch any of this, and don't pass it around. You and anyone who touches it will die, not instantly, but a painful death within months, perhaps after you have traveled a great distance" for 200 times the length of recorded history?
Why Morgan Stanley Is Betting That Tesla Will Kill Your Power Company
Half of Americans rent. People who rent can't do anything to their property. Apartment buildings are stuck with whatever they were built with 40 or 50 or more years ago. They're built using the cheapest technology available at construction time, and they're never upgraded. When they get old enough, they become the bad part of town, or in some cases the outright ghetto, until they collapse or are torn down. Some people rent houses, but there is no way your landlord is going to put solar panels and a charging system in your rental unit, at least not this decade, and not bloody likely the next.
When I read here on Slashdot about intelligent devices in homes, or this thing people call a garage, or home chargers for vehicles, or fiber to the home, it makes me laugh, because these aren't most people. These are things that less than half of Americans even have a chance of using.
People who rent aren't necessarily poor. Many renters in New York City, Boston, and San Francisco would be informally considered rich in most of the United States.
The electric company will continue to serve at least 50% of Americans indefinitely.
HP Gives OpenVMS New Life and Path To X86 Port
The Vomit-making system returns from the dead in zombie form!
Jesse Jackson: Tech Diversity Is Next Civil Rights Step
The major cause of the lack of minority and women computer programmers was a financial barrier to entry.
Today, you can get a desktop computer for $250, a tablet for $300, a laptop for $400, or an Android smartphone for $600, all of it medium- to high-end hardware, nothing second-hand or used. Fifteen years ago, you had to invest a minimum of $1,000 to get a new computer, and $1,500 to get something more reasonable. Just as importantly, decent home broadband connections are now affordable for all but the poorest individuals.
The difference between someone becoming a computer programmer, making millions of dollars over a career, and someone never entering the field may now be only a few hundred dollars of initial investment, whereas when I was a kid it was thousands of dollars. Fortunately, we no longer have to worry about that large investment, so this aspect of the problem has solved itself.
There are plenty of scholarship opportunities for minority and women computer programmers, but they need to get started way before college. Nobody learns programming at the university. If you're doing programming for the first time at the university, then very likely you'll never want to do it again. The programming work you do at school is dull, formulaic, theoretical, useless, and often frustrating.
Ask Slashdot: Preparing an Android Tablet For Resale?
You'll burn through more money in labor opening up the device without damaging it further and yanking the right chips than you'll get for it in parts.
What people should do with old tablets and smartphones is smash them. I'm sure there are techniques to wipe a tablet, but do you really want to take that kind of risk with your personal data? Even one credit card number accidentally cached by a sloppily programmed app can cause you far more harm than the $25 you might get for parts. You may not be liable for fraudulent charges, but you are on the hook for the hours on the phone, the paperwork, and the other hassles of having your credit card stolen.
When I dispose of hard drives, I smash the platters to bits. It takes no more than a minute or two. I have yet to dispose of an Android device, but the same principle would apply.
Do Apple and Google Sabotage Older Phones? What the Graphs Don't Show
The default setting for most apps is to phone home every 15 minutes, or at some other absurdly high frequency. Once you lose 4G coverage, your phone slows to a crawl. When you turn off automatic updates and notifications (which can be arduous or impossible for some apps), even older smartphones run well.
Every time my Samsung Galaxy S3 runs slowly, it's because some app developer forgot my preferences and turned auto updates back on. The ABC News app was the latest violator.
There is no reason an app can't load content on demand while running. When apps are not open, they should be doing nothing! No apps being open can be a strong indication that you are out of a coverage area, or that coverage is poor, so running automatic updates when no apps are open is the opposite of what you want.
The only thing I want to be happening in the background is backing up my files, and even then only on wifi and while charging. If I'm doing nothing, apps should be doing nothing.
Researchers Test Developer Biometrics To Predict Buggy Code
Microsoft Research should also track how far the individual works from his company's main office, because that has far more effect on bugs than any biometric reading. I recommend that they develop a special laser plus a series of geostationary satellites and ground repeater stations. The total round-trip time of the laser pulse will be a measure of how buggy the developer's code is.
1) Microsoft Research is wasting an awful lot of money to conclude that the reason why Microsoft's software is so terrible is that it's being written by outside companies in India.
2) Microsoft's well-paid American and European programmers are producing good or at least above average code, as would be expected.
3) Quality evaporates when they use foreign and H-1B workers who were educated at substandard universities, are inexperienced in engineering, and/or lack English superliteracy. [I've discussed elsewhere that basic literacy is not enough to be a good programmer: because interpersonal communication is essential to writing good code, you need enough language expertise to communicate without any ambiguity whatsoever. By the same token, if you're writing software for Hindi-speaking Indians, your entire team should be Hindi-speaking Indians.]
Tesla Model S Hacking Prize Claimed
Six digits? What is this, the mid-1980s?
Selectively Reusing Bad Passwords Is Not a Bad Idea, Researchers Say
I have passwords for hundreds of services.
You must be joking about memorization. I have three memorized.
Harvesting Energy From Humidity
Way too high tech for remote areas.
Bringing clean water to remote areas of Africa means using parts that can be sourced in those areas, with skills taught in those areas; otherwise it's back to dirty water in a few years.
Think about aliens crashing a ship on Earth. Where would we get the parts to fix it? Alien technology is worse than useless when it fails.
Obama Administration Says the World's Servers Are Ours
Effectively, though perhaps not in the strict legal definition, the purpose of a corporation is to make a profit.
As we can plainly see from Microsoft's conduct over the years, they will break whatever laws they can get away with to make that profit. This lawsuit isn't about Hotmail users' e-mail messages, this is about illegal or otherwise objectionable behavior that they are trying to shield in other countries.
If you're worried about your e-mail and data stored on Microsoft's servers, then let's not mix that up with a corporation's ability to hide illegal, unethical, or immoral behavior within a more compliant state's physical borders.
First Release of LibreSSL Portable Is Available
There is a lot of political discussion on this thread. How about a bit of technical discussion?
I spent about 20-30 minutes code reviewing the first few files in ssl/*.c.
The codebase looks better than most C code I look at. The indentation is a pleasure to look at.
I did notice a few issues. Wrappers are apparently still used around the memory allocation functions; I don't know whether that's for API compatibility or something else. There is more casting than I would like to read, and I hope all of it is absolutely necessary. Look at RSMBLY_BITMASK_MARK, for example: that code is absolutely horrible. Never write code like that. To me that is how not to write C, C++, Perl, Java, or PHP (it would look just as bad in any of them).
Lots of gotos. Not necessarily considered harmful, and not necessarily a sign of bad coding practices, but something to think about. gotos inside a switch, though? Yikes. I hope you really needed to do that.
The functions are very long. Linus Torvalds's rule of thumb is that a function should fit nicely on one screen: you should be able to look at it, conclude "that does x," and move on to the next function.
There you have it. I debug other people's code for a living, and sometimes write my own.
The Pentagon's $399 Billion Plane To Nowhere
The article's summary seems to imply that US taxpayers are on the hook for $1 trillion. That's not quite right:
"But the armed services are not the only customers. Eight international partners have signed on to help build and buy the planes: the U.K., Italy, the Netherlands, Turkey, Canada, Australia, Denmark, and Norway. While not involved in the development of the plane, Israel and Japan are buying it through the foreign military sales process, and South Korea recently indicated that it would buy at least 40 of the aircraft."
The US is set to buy 2,443 planes, but international sales will offset at least some of the expense both directly and indirectly.