


Comment Tethered Animatronics (Score 0) 145

It's interesting what is not shown in the video.
The thing is essentially an 'animatronic' doll with cables.

Power and processing are offloaded elsewhere.
Let's see it carry on a conversation while walking through the park in the rain.

The point I am making is that the complete system is not sitting there, nor can it be.

Comment Re:Credential-itis (Score 1) 531

I'm not sure 'temptation' is the position I'd choose. There is a reason why real A.I. is always 10 to 20 years away.

A.I. is competing with an organism that has evolved, over millions of years, to be very efficient at energy utilization. The organic brain needs to be energy efficient because animals may not find food continuously for long periods of time.

We have designed machines, to date, assuming that energy input will always be plentiful and ubiquitously available. If you add up the energy necessary to run computers performing all the simultaneous tasks that a society of human beings carries out, you end up needing a massive power supply. A power supply too big to carry.

Real, self-portable A.I. at the same level as human intelligence is, with current machinery, an energy impossibility. If you consider 'social' robots, free of wires, self-guided, and behaviourally/physically autonomous, then you're talking about tons of equipment and power supply per unit. To have a society of these things would consume more power than we could produce.

In short, the A.I. singularity with current technology would either be trapped in a network forever, or would be so large and consumptive that it would die right away from lack of energy if disconnected from the grid.

Don't just take it from me as fact. Work out how much power a single-purpose A.I. draws, and then multiply it up based on how much more a single human being can do simultaneously by comparison. A human burns about 2400 food calories on a good day. That's about ten megajoules of energy, or roughly 116 watts of continuous power. A desktop computer draws around 200 watts. Supercomputers are using hundreds of Tesla GPU cards at 225 watts each to create what we currently call simple A.I..
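You can check the arithmetic yourself. A minimal back-of-envelope sketch, using the figures from the paragraph above (2400 kcal/day human, 200 W desktop, 225 W per GPU card; the 200-card cluster size is a hypothetical stand-in for "hundreds of cards"):

```python
# Back-of-envelope power comparison: human metabolism vs. computing hardware.
KCAL_TO_JOULES = 4184          # 1 food calorie (kcal) = 4184 joules
SECONDS_PER_DAY = 24 * 3600

human_kcal_per_day = 2400
human_joules_per_day = human_kcal_per_day * KCAL_TO_JOULES   # ~10 megajoules
human_watts = human_joules_per_day / SECONDS_PER_DAY         # ~116 W continuous

desktop_watts = 200            # typical desktop draw
gpu_watts = 225                # one Tesla-class accelerator card
gpu_count = 200                # hypothetical "hundreds of cards" cluster
cluster_watts = gpu_count * gpu_watts                        # 45,000 W

print(f"Human:   {human_watts:.0f} W continuous")
print(f"Desktop: {desktop_watts} W")
print(f"Cluster: {cluster_watts / 1000:.0f} kW "
      f"(~{cluster_watts / human_watts:.0f}x one human's total power budget)")
```

A whole human body, brain included, runs on roughly what a dim light bulb uses, while one GPU cluster burns hundreds of times that. That is the energy gap the argument rests on.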

I have no worries of true autonomous A.I. being anything other than another 20 years away for a very long time.

Comment Re:Credential-itis (Score 1) 531

I agree. What you are noticing is that, in technology, new ideas generally come from the edges of the normal distribution. Academia serves the center of the technical distribution by providing known information to those who can do research, while most actual research happens out on the sloping edges of the curve. A.I., as a technology, is still a fringe science populated by early adopters and the curious, just as assembly programming would have been 40 years ago. A.I. will only be considered mature once it moves toward the mean of programming activity. The downside of that is a decline in the actual usefulness of human programmers, since A.I. will then optimize itself.

Comment Credential-itis (Score 3, Insightful) 531

To a certain extent this can be true. Our society, however, also suffers from another problem at the other end of this scale.

This is what I refer to as 'Academic Credentialitis'. This disease is pervasive in our society and needs to be stamped out.

Achieving academic standing in a subject is no guarantee that a person is actually good enough at that subject to be 'fail-safe'.
There is a systemic myth that somehow links academic standing with actual skill.

Another question concerns the design of academic programs. Programs can only be developed reactively, based on social context. This means that curriculum for any new, potentially disruptive technology can only be developed once that technology is already known. If we look at the top 20 historic developments in technology that have shaped human history in a disruptive way, the majority were non-credentialed developments.

Consider if you will the rise of the desktop computer. It was **NOT** a degreed professional who designed the first broadly successful consumer P.C..
(Wozniak only finished his engineering degree in 1986.) Even if we only consider the commercial success of the PC from an academic perspective, there were no business academics who predicted or pursued the development of a consumer personal computer until after it had already arrived on the business scene from a garage. So, in the context of the computing world that we now live in, academia had little to do with the early development and adoption of PC technology except to claim it after the fact. In short, if we had all adhered to academic credentials as the basis for the development of this technology, none of us would have it right now. We would all still be using teletypes and reading paper newspapers delivered by hand.

The academic myth has been created as a socio-economic filter to ensure that only those with suitable amounts of cash may achieve status in industry or government. This does not scale well to either skill or aptitude.

It has been suggested that aptitude testing would be a better way to validate skill level than degrees. The question is, "who designs the test?" There would be a strong bias to load the content of tests with useless information that only a degreed academic would know, in just the same way that requests for proposals are biased toward favoured contractors.

The credential is a problem, not a solution. We need to remove our social addiction to that particular social snake oil and get back to skills assessment instead.

Comment False Analogy (Score 3, Insightful) 531

First of all it's a programming language not a saw.

Secondly, the compilers and interpreters for almost all other languages are themselves written in 'C'.
If 'C' were a flawed language, then every language implemented in 'C' would inherently contain the same systemic flaws.

'C/C++' gets a bad rap from programmers because most programmers lack the skills necessary to make reusable patterns or program securely in any language let alone 'C'.

A better analogy would be that programming has become a lot like carpentry. All manner of people claim to be carpenters and joiners just because they own a hammer. In computing, all manner of people claim to be programmers because they own a computer, have a Comp. 150 or 250 course, became a Microsoft Certified Engineer (whatever that means), or downloaded a free compiler and read a manual once. Such is our culture.

It takes a great deal of experience to understand where problems can arise in any programming language. Unfortunately, the under-informed masses of under-skilled programmers tend to be negative about the technologies they understand the least.

The industry needs job entrance tests to demonstrate efficacy in programming, rather than simply accepting that people are 'qualified' because they dicked with code for 20 hours in high school.

Submission + - Tesla Autopilot Not to Blame in German Collision (reuters.com)

stoicio writes: The collision with a bus was apparently caused by the bus swerving into the oncoming lane.

Quoting the article: Tesla denied that Autopilot was at fault, saying the bus swerved into the car's lane and side-swiped the Tesla, making a collision "unavoidable," the spokeswoman said.
"We can only do so much to prevent an accident," she said, adding that Tesla was in contact with German police.

Meanwhile there were about 3000 collisions involving various other makes of automobile in the U.S. today (as usual). Those manufacturers will not be investigated.

It raises the question: "Is this a self-driving car problem, or are we all so habitually NOT following the rules that when cars do follow them, they get whacked?"

Comment Regulation of Drones Nothing to do With Safety (Score 1) 239

It's about *Who* gets to see what.

There is no statistical surge of horrible quadcopter maimings.
This is just governments and corporations stopping the internet before it happens again.

The internet allows the serfs to openly communicate and spread the word without government control. This has been bad news for countries where governments used to rely on simple propaganda to keep the masses in line. We are now suffering the last generation of people who can't stand that the communication genie has been let out of the bottle. They are fighting tooth and nail to maintain 'conservative values', which is to say, to protect their own misbegotten position in society.

So now these drones come along. Not only can we spread the word, we can also provide pictures of what happened.
The control is all about only allowing corporations and government to have drones.

It's anticompetitive pandering.

Comment Declare and Seize Assets: A New Economic Policy (Score 1) 166

So, according to this ruling, all the U.S. needs to do to control global industry is declare a business's conduct criminal, then seize its assets whether or not there is an official finding of guilt. Keep in mind that this case has not actually been to trial yet. Under U.S. law the defendant is innocent until proven guilty at trial.

Any country doing business with the U.S. should be wary of this type of anticompetitive, legalistic, economic activity. If this can happen to Kim Dotcom it can also happen to Sony, Toyota, Samsung, etc..

If global corporations just sit silently and let this happen, even if they think Dotcom is a sleaze, they are allowing an international precedent that will lay the groundwork for further international asset seizures. All that is needed is an accusation.

Dotcom should take this to The Hague.

Comment Drones are a Military Improvement in Air Combat (Score 1) 280

It's not about capability. It never was.

Each 'drone pilot' can theoretically have numerous additional aircraft in reserve if one is destroyed.
Military drones will become substantially less expensive to manufacture than manned aircraft in a very short period of time.

The insurance costs and long-term support costs for personnel directly engaged in combat, not to mention the cost of prior flight training in a cockpit, are eliminated by the use of drones.

'Brittle connection'? Nobody cares about that. The main focus is on total cost and strategic return. It always has been. That's why military forces all over the world moved on from muskets, swords, and horseback riding to automatic weapons and long-range tactics.

'Full awareness' drones are more likely, where each drone will have multiple personnel monitoring individual systems. This essentially gives each drone the 'eyes on' abilities of 10 to 100 pilots, without the risk of losing them in combat, and with near-instantaneous redeployment of those staff if a craft is lost. Having the seat outside the aircraft is not a problem, since we can now have a very small, fast aircraft with 100 or more people on board (virtually).

You can't do that with manned aircraft. The on board personnel take up too much room, consume too much fuel, and are a long term insurance liability.
The same will happen to ground combat forces in a very short period of time: one small, easily replaced robot backed by a room full of 'full awareness' operators.

This also means the era of companies selling manned field weapons systems, airborne or otherwise, is quickly drawing to a close. Product focus will shift to remotely tethered systems. In a couple decades anything with a human operator will be considered obsolete.
