Paul Fernhout writes: An article in the Harvard Business Review by William H. Davidow and Michael S. Malone suggests: "The 'Second Economy' (the term used by economist Brian Arthur to describe the portion of the economy where computers transact business only with other computers) is upon us. It is, quite simply, the virtual economy, and one of its main byproducts is the replacement of workers with intelligent machines powered by sophisticated code. ... This is why we will soon be looking at hordes of citizens of zero economic value. Figuring out how to deal with the impacts of this development will be the greatest challenge facing free market economies in this century. ... Ultimately, we need a new, individualized, cultural approach to the meaning of work and the purpose of life. Otherwise, people will find a solution — human beings always do — but it may not be the one for which we began this technological revolution."
This follows the recent Slashdot discussion of "Economists Say Newest AI Technology Destroys More Jobs Than It Creates," citing a NY Times article, and earlier discussions like Humans Need Not Apply. What is most interesting to me about this HBR article is not the article itself so much as the fact that concerns about the economic implications of robotics, AI, and automation are now making it into the Harvard Business Review. These issues have been discussed by alternative economists for decades, such as in the Triple Revolution Memorandum from 1964 — even as those projections have been slow to play out, with automation's initial effect being more to hold down wages and concentrate wealth than to displace most workers. However, we may be reaching the point where these effects have become hard to deny, despite going against mainstream theory, which assumes infinite demand and broad distribution of purchasing power via wages.
As to possible solutions, the HBR article mentions government planning, such as creating public works like infrastructure investments, to help address the issue. There is no mention of expanding the "basic income" of Social Security, currently received only by older people in the U.S., expanding the gift economy as represented by GNU/Linux, or improving local subsistence production using, say, 3D printing and gardening robots like Dewey of "Silent Running." So it seems the mainstream economics profession is starting to accept the emerging reality of this increasingly urgent issue, but is still struggling to think outside an exchange-oriented box for socioeconomic solutions. A few years ago, I collected dozens of possible good and bad solutions related to this issue. Like Davidow and Malone, I'd agree that the particular mix we end up with will be a reflection of our culture. Personally, I feel that if we are heading for a technological "singularity" of some sort, we would be better off improving various aspects of our society first, since our trajectory coming out of any singularity may have a lot to do with our trajectory going into it.
395 comments | 12 hours ago
Jason Koebler writes: If and when we finally encounter aliens, they probably won't look like little green men, or spiny insectoids. It's likely they won't be biological creatures at all, but rather advanced robots that outstrip our intelligence in every conceivable way. Susan Schneider, a professor of philosophy at the University of Connecticut, joins a handful of astronomers, including Seth Shostak, director of the Center for SETI Research at the SETI Institute, astrobiologist Paul Davies, and Library of Congress Chair in Astrobiology Stephen Dick, in espousing the view that the dominant intelligence in the cosmos is probably artificial. In her paper "Alien Minds," written for a forthcoming NASA publication, Schneider describes why alien life forms are likely to be synthetic, and how such creatures might think.
369 comments | yesterday
An anonymous reader writes The Microsoft Band, introduced last month, hosts a slew of amazing sensors, but like so many wearable computing devices, users are unable to access their own data. A Brown University professor decompiles the app, finds that the data is transmitted to the Microsoft "cloud", and explains how to intercept the traffic to retrieve the raw minute-by-minute data captured by the Band.
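Intercepting the traffic is only half the job; you still have to pull the per-minute samples out of whatever JSON the app ships home. As a rough sketch (the field names "minutes", "time", "heartRate", and "steps" are hypothetical placeholders, not the Band's actual schema), the parsing step might look like:

```python
import json

def parse_minute_data(payload: str):
    """Extract (time, heart rate, steps) tuples from an intercepted
    JSON payload. NOTE: the field names used here are hypothetical
    placeholders, not Microsoft's actual schema."""
    doc = json.loads(payload)
    return [(m["time"], m["heartRate"], m["steps"])
            for m in doc.get("minutes", [])]

# A made-up capture for illustration:
sample = ('{"minutes": ['
          '{"time": "2014-12-01T08:00", "heartRate": 62, "steps": 0}, '
          '{"time": "2014-12-01T08:01", "heartRate": 71, "steps": 43}]}')
rows = parse_minute_data(sample)  # two minute-by-minute samples
```

The interception itself would be done with a standard man-in-the-middle proxy trusted by the phone; the snippet above only illustrates the final extraction step.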
51 comments | 2 days ago
kwelch007 writes I commonly work in a clean-room (CR). As such, I often need access to my smart-phone for various reasons while inside the CR...but I keep it in my front pocket INSIDE my clean-suit. Therefore, to get my phone out of my pocket, I have to leave the room, and because I have a one-track mind, I commonly leave the phone sitting on a table or something in the CR, so I then have to either have someone bring it to me, or suit back up and go get it myself...a real pain. I have been looking into getting a 'smart watch' (I'm partial to Android, but I know Apple has similar smart-watches). I would use a smart watch as a convenient, easy-to-access way to get basic communications (email alerts, texts, weather maps, etc.). The problem I'm finding while researching these devices is that there aren't many apps. Sure, they can look like a nice digital watch, but I can spend $10 for that...not the several hundred or whatever to buy a smart watch. What are some apps I can get? (I don't care about platform, and I don't care if they're free.) I just want to know what's the best out there, and what it can do. I couldn't care less about it being a watch...we have these things called clocks all over the place. I need various sorts of data access. I don't care if it has to pair with my smart-phone using Bluetooth or whatever, and it won't have to be a 100% solution...it would be more of a convenience that is worth the several hundred dollars to me. My phone will never be more than 5 feet away; it's just inconvenient to physically access. Further, I am also a developer...what is the best platform to develop on for these wearable devices, and why? Maybe I could make my own apps? Is it worth waiting for the next generation of smart watches?
219 comments | 2 days ago
MojoKid writes Les Baugh, a Colorado man who lost both arms in an electrical accident 40 years ago, is looking forward to being able to insert change into a soda machine and retrieve the beverage himself. Thanks to the wonders of science and technology — and the Johns Hopkins University Applied Physics Laboratory (APL) — he'll regain some of those functions while making history as the first bilateral shoulder-level amputee to wear and simultaneously control two Modular Prosthetic Limbs (MPLs). "It's a relatively new surgical procedure that reassigns nerves that once controlled the arm and the hand," explained Johns Hopkins Trauma Surgeon Albert Chi, M.D. "By reassigning existing nerves, we can make it possible for people who have had upper-arm amputations to control their prosthetic devices by merely thinking about the action they want to perform."
66 comments | 2 days ago
Esra Erimez writes: Backblaze is transitioning from 4 TB to 6 TB hard drives in the Storage Pods it will be deploying over the coming months. With over 10,000 hard drives in service, the choice of which 6 TB drive to use is critical. They deployed 45 each of the Western Digital (WD60EFRX) and Seagate (STBD6000100) drives in two pods that were identical in design and configuration except for the hard drives used.
172 comments | 3 days ago
HughPickens.com writes: Claire Cain Miller notes at the NY Times that economists long argued that, just as buggy-makers gave way to car factories, technology used to create as many jobs as it destroyed. But now there is deep uncertainty about whether the pattern will continue, as two trends are interacting. First, artificial intelligence has become vastly more sophisticated in a short time, with machines now able to learn, not just follow programmed instructions, and to respond to human language and movement. At the same time, the American work force has gained skills at a slower rate than in the past — and at a slower rate than in many other countries. Self-driving vehicles are an example of the crosscurrents. Autonomous cars could put truck and taxi drivers out of work — or they could enable drivers to be more productive during the time they used to spend driving, which could earn them more money. But for the happier outcome to happen, the drivers would need the skills to do new types of jobs.
When the University of Chicago asked a panel of leading economists about automation, 76 percent agreed that it had not historically decreased employment. But when asked about the more recent past, they were less sanguine. About 33 percent said technology was a central reason that median wages had been stagnant over the past decade, 20 percent said it was not and 29 percent were unsure. Perhaps the most worrisome development is how poorly the job market is already functioning for many workers. More than 16 percent of men between the ages of 25 and 54 are not working, up from 5 percent in the late 1960s; 30 percent of women in this age group are not working, up from 25 percent in the late 1990s. For those who are working, wage growth has been weak, while corporate profits have surged. "We're going to enter a world in which there's more wealth and less need to work," says Erik Brynjolfsson. "That should be good news. But if we just put it on autopilot, there's no guarantee this will work out."
658 comments | 3 days ago
MojoKid writes Recently, Carnival Cruise Lines gave tours of its CSMART facility in Almere, the Netherlands. The facility is one of a handful in the world that can provide both extensive training and certification on cruise ships as well as a comprehensive simulation of what it's like to command one. Simulating the operation of a cruise ship is anything but simple. Let's start with a ship that's at least passingly familiar to most people — the RMS Titanic. At roughly 46,000 tons and 882 feet long, she was, briefly, the largest vessel afloat. Compared to a modern cruise ship, however, Titanic was a pipsqueak. As the size and complexity of the ships has grown, so has the need for complete simulators. The CSMART facility currently sports two full bridge simulators, several partial bridges, and multiple engineering rooms. When the Costa Concordia wrecked off the coast of Italy several years ago, the facility was used to simulate the wreck based on data from the ship's black boxes. When CSMART moves to its new facilities, it will pick up an enormous improvement in processing power: the next-gen visual system will be powered by 104 GeForce GRID systems running banks of GTX 980 GPUs. CSMART executives claim it will substantially reduce their total power consumption thanks to the more efficient Maxwell GPUs. It was unclear which solution is currently in place, but the total number of installed systems is dropping from just over 500 to 100 rack-mounted units.
42 comments | 4 days ago
New submitter st1lett0 writes: Now, as in years past, electronic engineers and hobbyists alike have enjoyed the classic 1972 April Fools' joke from Signetics: the 25120 Write-Only Memory chip. Now it seems the previously anonymous practical joker has identified himself and stepped forward with new information to correct and complete the story.
100 comments | 4 days ago
Molly McHugh writes: What better way to sell telepresence technology than having the store employees themselves appear via robot? At the Beam store in Palo Alto, Calif., no human salespeople are physically present, only robots. Operators appear on the 17-inch display and control the robot via keyboard, mouse, or Xbox controller; the Beam can roll as fast as two miles per hour, and two wide-angle cameras attached to the top of the bot let the operators see everything happening around the store. It's a bit eerie watching floating heads tool around and talk to people in the video, and the customers react to the Beam with confusion and wonder.
52 comments | 5 days ago
szczys writes Obviously the personal computer revolution was world-wide, but the Eastern Bloc countries had a story of PC evolution all their own. Martin Malý tells first-hand of his experiences with black-market imports, locally built clones of popular Western machines, and all kinds of home-built equipment. From the article: "The biggest problem was a lack of modern technologies. There were a lot of skilled and clever people in eastern countries, but they had a lot of problems with the elementary technical things. Manufacturing of electronics parts was divided into diverse countries of Comecon – The Council for Mutual Economic Assistance. In reality, it led to an absurd situation: You could buy the eastern copy of Z80 (made in Eastern Germany as U880D), but you couldn't buy 74LS00 at the same time. Yes, a lot of manufacturers made it, but 'it is out of stock now; try to ask next year.' So 'make a computer' meant 50 percent of electronics skills and 50 percent of unofficial social network and knowledge like 'I know a guy who knows a guy and his neighbor works in a factory, where they maybe have a material for PCBs' at those times."
115 comments | 5 days ago
itwbennett writes According to a report in the Korea IT Times, Samsung Electronics has begun production of the A9 processor, the next-generation ARM-based CPU for the iPhone and iPad. The Korea IT Times says Samsung has production lines capable of FinFET process production (a cutting-edge design for semiconductors that many other manufacturers, including AMD, IBM and TSMC, are adopting) in Austin, Texas and Giheung, Korea, but production is only taking place in Austin. Samsung invested $3.9 billion in that plant specifically to make chips for Apple. So now Apple can say its CPU is "Made in America."
114 comments | 5 days ago
Forbes contributor Jason Evangelho has nothing good to say about a recent Windows 7 patch that's causing a range of trouble for some users. He writes: If you have Windows 7 set to automatically update every Tuesday, it may be time to permanently disable that feature. Microsoft has just confirmed that a recent update — specifically KB 3004394 — is causing a range of serious problems and recommends removing it. The first issue that caught my attention, via AMD's Robert Hallock, is that KB 3004394 blocks the installation or updating of graphics drivers such as AMD's new Catalyst Omega. Nvidia users are also reporting difficulty installing GeForce drivers, though I can't confirm this personally as my machines are all Windows 8.1. Hallock recommended manually uninstalling the update, advice now echoed officially by Microsoft. More troubles are detailed in the article; on the upside, Microsoft has released a fix.
229 comments | about a week ago
mikejuk (1801200) writes "When British astronaut Tim Peake heads off to the International Space Station in November 2015, he will be accompanied on his 6-month mission by two augmented Raspberry Pis, aka Astro Pis. The Astro Pi board is a Raspberry Pi HAT (short for Hardware Attached on Top), and provides a gyroscope, accelerometer, and magnetometer, as well as sensors for temperature, barometric pressure, and humidity. It also has a real-time clock, an LED display, and some push buttons — it sounds like the sort of add-on we could do with down here on Earth as well! It will also be equipped with both a camera module and an infra-red camera. UK school pupils are being challenged to write Raspberry Pi apps or experiments to run in space. During his mission, Tim Peake will deploy the Astro Pis, upload the winning code while in orbit, set them running, collect the data generated, and then download it to be distributed to the winning teams."
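A typical student experiment for a board like this boils down to a loop that samples the sensors and logs time-stamped readings for later download. The sketch below stubs out the sensor call with random values; the real Astro Pi is driven through its own Python library, which isn't assumed here:

```python
import csv
import random
import time

def read_sensors():
    """Stand-in for the Astro Pi sensor API: returns fake
    temperature (C), pressure (hPa), and humidity (%) readings.
    On real hardware this would call into the board's library."""
    return (round(random.uniform(18, 25), 2),
            round(random.uniform(990, 1020), 2),
            round(random.uniform(35, 55), 2))

def log_readings(path, samples, interval_s=0):
    """Sample the sensors `samples` times, `interval_s` apart,
    and append each reading to a timestamped CSV log."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "temp_c", "pressure_hpa", "humidity_pct"])
        for _ in range(samples):
            writer.writerow([time.time(), *read_sensors()])
            time.sleep(interval_s)

log_readings("astro_log.csv", samples=5)
```

In a real experiment, `interval_s` would be set to the sampling period and `read_sensors()` replaced with actual calls to the HAT's gyroscope, environmental, and clock hardware.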
56 comments | about a week ago
Home automation is a recurring topic around here; we've had stories about X-10-based home-brewed systems, a protocol designed for automation, and more than a few Ask Slashdots. Now, an anonymous reader writes OpenMotics is an open source home automation hardware and software system that offers features like switching lights and outputs, multi-zone heating and cooling, power measurement, and automated actions. The system encompasses both open source software and hardware. For interoperability with other systems, the OpenMotics Gateway provides an API through which various actions can be executed. The project was started about 10 years ago and open sourced 2 years ago. The choice to open source it was deliberate: we want to offer a system where users are in full control of their home automation system.
36 comments | about a week ago
An anonymous reader writes I recently got my hands on some amazing (for their time) pieces of technology: PocketPCs from the 2005-2007 era. All run Windows Mobile 5 or 6, take SD storage cards (up to 4GB), and have 300 to 600 MHz ARM CPUs and 64-124MB of RAM/ROM. The GPS chip is a SiRF Star III. I want to know what software you would install on them. Maybe a good Linux with a GUI, if anyone can point me to how to make that work. Creating some apps myself would be nice, but I don't know where to start for WM5. One of my ideas was to use them as a daily organizer / shopping list / memory-game device for people who don't own smartphones. So if anyone remembers such apps, I'd appreciate a reference. Tips or ideas for memory training or smart games are also highly welcome. The power within these toys is simply unused, and it's a shame!
110 comments | about a week ago
MojoKid writes Seagate has just announced a new 'Archive' HDD series, one that offers capacities of 5TB, 6TB, and 8TB. That's right: 8 terabytes of storage on a single drive, and for only $260 at that. Back in 2007, Seagate was one of the first to release a hard drive based on perpendicular magnetic recording (PMR), a technology that was required to break past the roadblock of 250GB per platter. Since then, PMR has evolved to allow the release of drives as large as 10TB, but to go beyond that, something new was needed. That "something new" is shingled magnetic recording (SMR). As its name suggests, SMR overlaps drive tracks in a shingled pattern, much like shingles on a roof. With this design, Seagate is able to cram much more storage into the same physical area. It should be noted that Seagate isn't the first out the door with an 8TB model: HGST released one earlier this year. Instead of a design like SMR, however, HGST went the helium route, allowing it to pack more platters into a drive.
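The capacity gain from shingling falls out of simple geometry: tracks are written at the (wide) write-head pitch but overlapped until only a strip about the (narrow) read-head width remains readable. A back-of-the-envelope model, using illustrative numbers rather than Seagate's actual head dimensions:

```python
def smr_capacity_gain(write_track_nm: float, read_track_nm: float) -> float:
    """First-order capacity gain from shingled recording: overlapping
    tracks shrinks the effective track pitch from the write-head width
    to roughly the read-head width, so tracks per inch (and capacity,
    to first order) scale by their ratio."""
    return write_track_nm / read_track_nm

# Illustrative numbers: a 75 nm write track overlapped down to a
# 50 nm readable strip yields 1.5x the tracks in the same area.
gain = smr_capacity_gain(75, 50)
```

Real-world gains are somewhat lower than this first-order ratio, because SMR drives leave gaps between bands of shingled tracks so that a rewrite only forces a band, not the whole surface, to be rewritten.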
219 comments | about a week ago
PC Magazine reports (citing a blog post from project manager Andrew Nartker) that Google's Cardboard -- first introduced to some laughter -- is growing up, with a small but growing collection of compatible apps and a recently announced SDK. And while Cardboard itself is pretty low-tech (cardboard, rubber band, a magnet) and consequently cheap, the resulting VR experience is pretty good, which explains why more than 500,000 of them have now shipped.
28 comments | about a week ago
Nerval's Lobster writes A funny thing happened to the iPod Classic on its way to the dustbin of history: people seemed unwilling to actually give it up. Apple quietly removed the iPod Classic from its online storefront in early September, on the same day CEO Tim Cook revealed the latest iPhones and the upcoming Apple Watch. At 12 years old, the device was ancient by technology-industry standards, but its design was iconic, and a subset of diehard music fans seemed to appreciate its considerable storage capacity. At least some of those diehard fans are now paying four times the iPod Classic's original selling price for units still in the box. The blog 9to5Mac mentions Amazon selling some last-generation iPod Classics for $500 and above. Clearly, some people haven't gotten the memo that touch-screens and streaming music were supposed to be the way of the future.
269 comments | about a week ago
MojoKid writes One of the most in-your-face buzzwords of the past year has been "4K," and there's little doubt that the forthcoming CES show in early January will bring it back in full force. As it stands today, 4K really isn't that rare, or expensive; you can even get 4K PC monitors for an attractive price. There remains one issue, however: a lack of 4K content. We're beginning to see things improve, but it's still slow going. Given that, you might imagine that display vendors would hold off on pushing the resolution envelope further – but you just can't stop hardware vendors. Earlier this year, both Apple and Dell unveiled "5K" displays that nearly doubled the pixel count of 4K displays. 4K already brutalizes top-end graphics cards and lacks widely available video content, and yet here we are looking at the prospect of 5K. Many jaws dropped when 4K was first announced, and likewise with 5K. Now? Well, yes, 8K is on its way. We have LG to thank for that. At CES, the company will be showing off a 55-inch display that boasts a staggering 33 million pixels — derived from a resolution of 7680x4320. It might not be immediately clear, but that's far more pixels than 4K, which suggests this whole "K" system of measuring resolutions is a little odd. On paper, you might imagine that 8K has twice the pixels of 4K, but instead, it's 4x.
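The arithmetic behind that last point is easy to check: the "K" label tracks the horizontal pixel count, so doubling it doubles both width and height and therefore quadruples the total pixels.

```python
# Common "K" resolutions (the consumer UHD variants)
RESOLUTIONS = {
    "4K UHD": (3840, 2160),
    "5K":     (5120, 2880),
    "8K UHD": (7680, 4320),
}

# Total pixel count for each resolution
pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}

# 7680 * 4320 = 33,177,600 -- the "staggering 33 million pixels" in LG's panel
ratio_8k_vs_4k = pixels["8K UHD"] / pixels["4K UHD"]  # 4.0, not 2.0
```

The same math explains the 5K claim: 5120 x 2880 is about 1.78x the pixels of 4K UHD, hence "nearly doubled."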
179 comments | about a week ago