
Intel Predicts Ubiquitous, Almost-Zero-Energy Computing By 2020

samzenpus posted about 2 years ago | from the day-after-tomorrow dept.


MrSeb writes "Intel often uses the Intel Developer Forum (IDF) as a platform to discuss its long-term vision for computing as well as more practical business initiatives. This year, the company has discussed the shrinking energy cost of computation as well as a point when it believes the energy required for 'meaningful computing' will approach zero and become ubiquitous by the year 2020. The idea that we could push the energy cost of computing down to nearly immeasurable levels is exciting. It's the type of innovation that's needed to drive products like Google Glass or VR headsets like the Oculus Rift. Unfortunately, Intel's slide neatly sidesteps the greatest problems facing such innovations — the cost of computing already accounts for less than half the total energy expenditure of a smartphone or other handheld device. Yes, meaningful compute might approach zero energy — but touchscreens, displays, radios, speakers, cameras, audio processors, and other parts of the equation are all a long way away from being as advanced as Intel's semiconductor processes."


144 comments



SWEET! (2, Funny)

Anonymous Coward | about 2 years ago | (#41329511)

I can't wait to overclock those chips so high that I need liquid cooling! Sounds like a fun project.


Almost? (3, Funny)

nukenerd (172703) | about 2 years ago | (#41329527)

"Almost" ?? As in "I almost saw one camel today"?

Re:Almost? (4, Funny)

bongey (974911) | about 2 years ago | (#41329577)

Schrödinger's camel ?

Re:Almost? (-1, Offtopic)

aekafan (1690920) | about 2 years ago | (#41331659)

Schrödinger's camel ?

No, Schrödinger's camel toe

Re:Almost? (1)

RicktheBrick (588466) | about 2 years ago | (#41330283)

So I guess that would mean a supercomputer like the latest IBM supercomputer would run in the average home on a wall socket. Now that would take a lot of improvement in the next 8 years.

Re:Almost? (2)

robthebloke (1308483) | about 2 years ago | (#41330413)

I'm still waiting for that 10Ghz Pentium 4 they promised.....

Re:Almost? (0)

Anonymous Coward | about 2 years ago | (#41330675)

That was my thought. For many applications a low-speed, low-power-consumption part is a good thing. But there are still plenty of common applications that would be better off using 65W of electricity and spending the reduced energy consumption on more cores or more performance.

Also, isn't this the same Intel that fails to understand that ARM is going to be very important in the future? AFAIK they're the only ones that aren't licensing the technology.

Re:Almost? (2)

fuzzyfuzzyfungus (1223518) | about 2 years ago | (#41330853)

Also, isn't this the same Intel that fails to understand that ARM is going to be very important in the future? AFAIK they're the only ones that aren't licensing the technology.

Intel purchased 'StrongARM' from DEC ages ago (back when DEC still had things you could purchase), took it through a few generations under that name and then as 'Xscale', and then sold it to Marvell 6-ish years ago (with the possible exception of one flavor that they use on their RAID boards, I can't remember).

They still have an ARM license, they just aren't terribly motivated to use it. x86 doesn't have too many friends; but it certainly has a lot of customers, and Intel has somewhat... limited... incentive to march into the business of being yet another SoC shover as long as they can get away with the margins on their x86 parts and supporting silicon.

Re:Almost? (0)

Anonymous Coward | about 2 years ago | (#41330961)

They still have an ARM license, they just aren't terribly motivated to use it.

Understand that in a company full of microprocessor designers, the idea of paying for someone else's processor design was a huge fly in the ointment. The pressure from below to get rid of it and instead sell home-grown low-power designs was enormous. This week's IDF is showing some of the fruits of that decision.

Re:Almost? (2)

froggymana (1896008) | about 2 years ago | (#41331287)

But there are still plenty of common applications that would be better off using 65W of electricity...

Such as heating water for coffee or cooking breakfast?

"meaningful" (4, Insightful)

Hazel Bergeron (2015538) | about 2 years ago | (#41329603)

My Psion Series 3a computed "meaningfully" on a couple of AA batteries for days.

Re:"meaningful" (3, Insightful)

0123456 (636235) | about 2 years ago | (#41329751)

Indeed. By the time we can do today's 'meaningful computation' for almost no energy, the definition will have changed to make it as 'meaningful' as what we used to run on a 6502.

Re:"meaningful" (2)

Earl_Parvisjam (2621029) | about 2 years ago | (#41329779)

Oh yeah? Well my Casio calculator watch computed meaningfully for two years on the same battery, back in the '80s. Intel's just jealous.

Re:"meaningful" (2)

Hazel Bergeron (2015538) | about 2 years ago | (#41330667)

Well my Tetris watch is still computing meaninglessly.

No, I lie, I think I traded it in nineteen eighty-something. Can't remember for what, though. I hope it was good. I miss that watch.

Think orders of magnitude... (3, Interesting)

Esteanil (710082) | about 2 years ago | (#41331455)

Smaller. Smaller. Smaller.

Smart Dust, is what we're talking about - or at least the early iterations.

Weather sensors that flutter in the breeze and scavenge enough energy to remain active and transmitting at most times - and the swarm *always* transmits.
Flow control sensors that oil companies continually release into their pipelines to ensure that if there's a leak they'll know where it is in milliseconds - there are transmitting sensors outside the approved geometric area.
Microscopic "Sniffers" released into the wind, measuring and reporting the amounts of cannabis, cocaine, explosives, dangerous chemicals...
Sensors to detect fire. Sensors to find out if the gas tank in that burning building is leaking at all. Just point into an air current (strong fan or wind) and let them fly from your hands.
*True* microsatellites, measured in single-digit centimetres or even smaller. (I think there's a minimum useful size for a satellite, but it's greatly related to how many of them there are, also... You could have a continual swarm reaching through the low-energy planetary transfer network keeping in contact with quite small satellites in a mesh radio network).

Making Smart Dust *safe* might turn out to be more of a challenge, though... :-)

But "really-really-low-power computing"... Alongside bio/nano-tech convergence it's the beginning of the real microbots:
Invisible cameras, as a perfect 3D image of your head emerges from the small swarm of the tiniest insects you've seen hover around your head.
Robots navigating through your bloodstream, tiny as hell - yet you've somehow ended up with the processing equivalent of your (2012) mobile phone coursing through your veins and working on any health problems you have (mostly by monitoring, at least at first).

I'm sure you guys can come up with more stuff. Please reply if you've got any ideas :-)

Re:Think orders of magnitude... (1)

Esteanil (710082) | about 2 years ago | (#41331479)

Small correction: The Interplanetary Transport Network [wikipedia.org] , not the low-energy planetary transfer network :-)
Bit tired now, good night folks.

Re:Think orders of magnitude... (1)

davester666 (731373) | about 2 years ago | (#41331957)

Hilarious.

1. Technology like this will first be used for surveillance by the gov't, because the drones they want to use now are too noticeable. It also might accidentally go places where they can't legally go, such as in your home ["but your Honor, we can't control that the defendant left his window open and some of our VideoDust happened to blow in and record his drug deal"]

2. There will always be better idiots. Like Enbridge's Michigan oil spill, where an alarm went off [hello, we have an oil spill], and the operators instead noticed that the pressure in the line was going down [also an excellent hint that there is an oil leak], but came to the decision that they should pump more oil into the line faster, to get the pressure back up.

Re:"meaningful" (1)

AmiMoJo (196126) | about 2 years ago | (#41331767)

I make data loggers that run for five or more years on a single C cell battery. Zero energy and useful work mean quite different things to different people.

Release the source (0)

Anonymous Coward | about 2 years ago | (#41329617)

I hope that Intel releases the source of the crack that they are smoking.

Re:Release the source (0)

Alex Belits (437) | about 2 years ago | (#41331697)

I hope that Intel releases the source of the crack that they are smoking.

They license it from Microsoft.

why not (0)

Anonymous Coward | about 2 years ago | (#41329647)

Computation is zero energy, for sufficient values of zero.

Let's be pedantic (1)

symbolset (646467) | about 2 years ago | (#41330751)

Computation is zero energy, for sufficient values of zero.

Since the energy inputs and outputs of digital computation are necessarily equal for all forms of computation not involving fission, fusion, zero-point energy, quantum teleportation, black holes and other such esoteric things in the computation process, yes, computation is zero energy. I would wager that the entire amount of energy created or destroyed in the process of computation by humans in all of history wouldn't amount to an entire Joule.

Computation does, for now, require converting electrical energy into thermal energy, though. Since quite a lot of electrical energy is used to create thermal energy in the regular course of business anyway, peak computational energy efficiency can be improved not just by increasing the computations per kWh but also by putting the computation where you wanted the thermal energy anyway, or by using the thermal energy for some other purpose once you have it. That way you get to use the same watt twice at no additional cost. Fortunately Intel is already on this one too. [intel.com]
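For what it's worth, the usual name for the thermodynamic floor being danced around here is Landauer's principle: irreversibly erasing one bit dissipates at least k_B·T·ln 2 of heat. A rough sketch of how tiny that floor is (standard constants, room temperature assumed):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_k: float = 300.0) -> float:
    """Minimum energy dissipated per irreversibly erased bit: k_B * T * ln 2."""
    return K_B * temp_k * math.log(2)

# At room temperature this is about 2.9e-21 J per bit -- many orders
# of magnitude below what real transistors dissipate per switch.
per_bit = landauer_limit_joules()
```

Reversible computing could in principle go below even this, which is the sense in which the "computation is zero energy" claim above holds.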

Heating homes via computers (1)

Paul Fernhout (109597) | about 2 years ago | (#41331393)

"Since quite a lot of electrical energy is used to create thermal energy in the regular course of business anyway, peak computational energy efficiency can be improved not just by increasing the computations per kWh but also by putting the computation where you wanted the thermal energy anyway, or by using the thermal energy for some other purpose once you have it. That way you get to use the same watt twice at no additional cost."

A related idea I had:
http://hardware.slashdot.org/comments.pl?sid=2344998&cid=36859662 [slashdot.org]
"(I'll give away an idea here as a patent-preventing disclosure that I've been hoarding. :-) You could have this or any other local industrial process be thermostat controlled (or predictively controlled, or timer controlled, or some combination), so if your house or facility needs more heat you run the process; and if your building is hot enough for your needs, you don't run it, thus using local industrial-like processes to regulate your home's climate. For processes that absorb heat you could do the inverse for air conditioning. You can do that with networked computers too, so if you need heat you do local computation for the network; if you don't need heat, you shut those processors down. Special processor units or industrial process units for various purposes could be designed to replace regular electric baseboard heaters or central furnaces. So, essentially, industry is running for no extra energy charge where people use electricity to heat, and it runs at a subsidy where people use currently cheaper ways to heat like oil or gas or wood. And sometimes you might want to produce stuff anyway, and so you would need to dump the waste heat or use it in some other way or store it in some thermal storage system like a water mass or sand mass or phase-changing salts or other such system, with the stored heat being used as part of the thermoregulatory planning. Of course, if you insulated your home well, you might not need a furnace, so there are economic limits to this idea as people improve their infrastructure in other ways...)
        This would totally change how agriculture was done. Instead of having lunar moonscapes like Iowa is part of the year, people would just produce their own agricultural liquids in neighborhood facilities or at home, using the local waste heat for other purposes as well. Most agricultural lands could be returned to wilderness. The total energy bill for a home might not go up very much using the above idea for thermostatic regulation. "

Re:Heating homes via computers (2)

symbolset (646467) | about 2 years ago | (#41331803)

You should probably patent that.

nice (an nitpick) (5, Insightful)

DMiax (915735) | about 2 years ago | (#41329653)

touchscreens, displays, radios, speakers, cameras, audio processors, and other parts of the equation are all a long way away from being as advanced as Intel's semiconductor processes

It may not be possible at all to lower the power consumption of certain devices below a certain absolute threshold. No matter how advanced, a WiFi device has to consume at least the power needed to reach other devices. A backlit screen will use at the very least the power it emits as light, etc... It is not simply a matter of technological advances.

That said: amazing prospect. Hope it's not just bold claims with no substance. It would really be fantastic.

Re:nice (an nitpick) (2)

arbiter1 (1204146) | about 2 years ago | (#41329839)

I would guess it's Intel talking about the CPU side of things; the rest they don't really have control of R&D-wise, except WiFi on laptops, but Intel has made some decent power savings on the WiFi side of things in their chipsets.

Re:nice (an nitpick) (1)

lister king of smeg (2481612) | about 2 years ago | (#41331107)

They would need more than control of the R&D of the WiFi chips; they would need control of the laws of physics. Emitting a WiFi radio signal requires significant energy expenditure because you are emitting that energy.

Re:nice (an nitpick) (1)

Noughmad (1044096) | about 2 years ago | (#41331913)

But with more sensitive receivers, the transmitter has to emit less energy. So improvements are possible.

Re:nice (an nitpick) (1)

Idbar (1034346) | about 2 years ago | (#41329845)

Overall, you're right. I wonder if they can also be thinking about the "perceived" power consumption. I'm guessing the phone tower transmitting to you may also transmit enough power such that it can power your phone for long enough, and even re-charge it if the whole system is efficient.

Re:nice (an nitpick) (2)

petermgreen (876956) | about 2 years ago | (#41330557)

Radio losses are a bitch, only a tiny fraction of the original energy is left after a typical radio path.

So if you try to power your device over radio you end up with a tiny fraction of the original energy, then conversion losses, and then only a tiny fraction of what's left gets back to the base station. It's just not practical except for very short-distance links to very low-power devices (think: RFID tags).
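The scale of those losses is easy to see from the free-space path loss formula, FSPL = 20·log10(4πdf/c); a quick sketch (idealized free space and isotropic antennas assumed, function name is mine):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# 100 m at WiFi's 2.4 GHz already loses roughly 80 dB -- a factor of
# about 10^8 -- before walls, interference, or antenna losses count.
loss = fspl_db(100, 2.4e9)
```

Which is why radio-powering anything beyond RFID-tag distances is a non-starter.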

Re:nice (an nitpick) (5, Interesting)

kaiser423 (828989) | about 2 years ago | (#41331307)

RF engineer here. Let's put this in perspective. Your typical cell phone will receive somewhere around -50 dBm maximum. That's typically 4-5 full bars of reception. My phone is sitting next to me right now reading -88 dBm, and that's two bars.

So, let's say that you're receiving that -50 dBm signal. -50 dBm is -80 dBW. Let's convert that straight to watts: 10^(-80/10). That's 1e-8 watts, or 10 nanowatts. Good luck charging your phone with that.

That's also why you see RF being used everywhere. The dynamic range is huuuuuuuuge! Your cell phone can transmit +30 dBm or more, and you can reliably receive -80 dBm. So, you're able to transmit watts pretty easily, and receive nanowatts pretty easily. Yeah, path loss can be a lot, but you've got a lot of headroom to deal with. That's just in the palm of your hand. Add in big, megawatt amplifiers and huge dishes with large, sensitive electronics and it's no wonder that we can reach out billions of miles. Really mind-boggling stuff if you stop to think about it.
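The dBm arithmetic in the parent is worth having as a one-liner (function name is mine):

```python
def dbm_to_watts(dbm: float) -> float:
    """Convert power in dBm (dB relative to 1 mW) to watts."""
    return 10 ** (dbm / 10) / 1000.0  # 10^(dBm/10) gives mW; then mW -> W

strong_signal = dbm_to_watts(-50)  # ~1e-8 W: 10 nW at 4-5 bars
tx_power = dbm_to_watts(30)       # 1.0 W: a typical max transmit level
```

The ~110 dB spread between those two numbers is the dynamic range being described.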

Re:nice (an nitpick) (1)

wvmarle (1070040) | about 2 years ago | (#41332517)

Why not just use a crystal receiver? No batteries needed, no charging, gets all the power from the radio mast.

It may not work so well on mobile phone frequencies and so on, but the never-recharge feature should trump that minor inconvenience! AM radio ftw :-)

Re:nice (an nitpick) (1)

timeOday (582209) | about 2 years ago | (#41329981)

Wifi could probably use much less power if it were able to dynamically steer a high-gain antenna towards the base.

Or maybe devices could use optical signaling instead. Imagine if the device used modulation of a mirror (perhaps by putting an LCD in front of it) so it could do bidirectional communication using only reflected energy (sapping only a tiny amount of energy for the modulator). This could be done opportunistically; if you're indoors and one of these IR transceivers is overhead use that, otherwise fall back to wifi; otherwise fall back to a radio cell.

I guess my point is some of these "hard physical limits" can become insignificant if you have enough infrastructure.

Re:nice (an nitpick) (3, Interesting)

aNonnyMouseCowered (2693969) | about 2 years ago | (#41330543)

"A backlit screen will use at the very least the power it emits in light, etc... It is not simply a matter of technological advances." Our technology won't be sufficiently advanced unless it's as energy efficient as nature. How much energy does a bioluminescent fish consume? I often read about the brain being compared to a light bulb, and not just because of the Edison "invention" connection. Cellphones already consume less energy than a 5W lightbulb but are nowhere near as powerful as the McDonald's-powered supercomputer inside our heads. Maybe the trick isn't getting as near to zero energy as physically possible but making our information devices sophisticated enough to recharge themselves using whatever "free" energy source is available, be that the heat and radiation of the sun, the kinetic energy of a jogger, or the mere act of carrying the cellphone in your pocket while walking to the office.

Re:nice (an nitpick) (1)

Immerman (2627577) | about 2 years ago | (#41331075)

The brain really isn't remarkably energy efficient - sure, it compares favorably to current tech, but its lead is shrinking rapidly. After all, it's responsible for roughly 20% of your body's total energy consumption, which, assuming a BMR of 1300 kcal/day, works out to an average energy consumption of almost 13W. And I've heard that championship-level chess players can burn as many as 5000 kcal/day during a tournament, which would suggest an additional 180W of average energy consumption, with peak consumption probably being at least 2-3 times that.
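The parent's unit conversion checks out; sketched explicitly (the 20% share and the 1300 kcal/day BMR are the parent's assumptions):

```python
KCAL_TO_J = 4184.0         # thermochemical kilocalorie, in joules
SECONDS_PER_DAY = 86_400.0

def kcal_per_day_to_watts(kcal_per_day: float) -> float:
    """Convert a metabolic rate in kcal/day to average power in watts."""
    return kcal_per_day * KCAL_TO_J / SECONDS_PER_DAY

# 1300 kcal/day is about 63 W of average metabolic power;
# 20% of that for the brain is roughly 12.6 W.
brain_watts = 0.20 * kcal_per_day_to_watts(1300)
```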

Re:nice (an nitpick) (1)

O('_')O_Bush (1162487) | about 2 years ago | (#41331405)

Rapidly shrinking in comparison to standstill, sure, but what does that matter in comparison to a technology lightyears away? We are still 15-30 years away from being able to model the synapses, maybe 50 from the full brain. And even then, without supercomputing, it would be drawing on the GW scale.

Our brains are the product of many millions of years of design improvements, as the less efficient the brain is, in processing power or energy usage, the smaller the chances of survival. I doubt we will ever reach brain efficiency on silicon.

Re:nice (an nitpick) (2)

scheme (19778) | about 2 years ago | (#41331719)

The brain really isn't remarkably energy efficient - sure, it compares favorably to current tech, but its lead is shrinking rapidly. After all, it's responsible for roughly 20% of your body's total energy consumption, which, assuming a BMR of 1300 kcal/day, works out to an average energy consumption of almost 13W. And I've heard that championship-level chess players can burn as many as 5000 kcal/day during a tournament, which would suggest an additional 180W of average energy consumption, with peak consumption probably being at least 2-3 times that.

That's wrong. Triathletes and cyclists doing long races can go through 5000kCal/day. Chess players don't come close. They're around 100-120kCal/hr at most [nih.gov] .

Re:nice (an nitpick) (1)

Nemyst (1383049) | about 2 years ago | (#41331459)

Sometimes I think it might actually be a good idea to find a way to "power" your devices using your own energy - so finding a way of converting ATP into electricity for electronic devices. Out of juice? No problem, just eat a sandwich and you're good to go!

As a bonus, you'd be able to say you lost weight by running folding@home. Solves obesity problems and advances science!

Re:nice (an nitpick) (1)

ArsonSmith (13997) | about 2 years ago | (#41332113)

I'm not fat, me electronics just have a long battery life.

Re:nice (an nitpick) (1)

wvmarle (1070040) | about 2 years ago | (#41332545)

Not sure how much the brain uses but it's probably an order of magnitude more.

And when it comes to computing, computers win. When it comes to pattern recognition, brains win. Just have to use the best tool for the job.

Re:nice (an nitpick) (1)

godrik (1287354) | about 2 years ago | (#41331373)

I am sure there is a bottom limit that we will not be able to pass. But how low is that limit, actually? We are pushing the power consumption of all our technology way down: in recent screen technologies (LCD, LED, e-ink) and recent storage technologies (flash, SSD, NVRAM).

Imagine making smart glasses with a Raspberry Pi: the glasses are fitted with an e-ink type of display (is that possible on glasses? I don't know the technology well enough), the storage is on an SD card, and the input control is a microphone. That could in total consume less than 10W. OK, there is no networking included in that; I don't know much about low-power network interfaces, but Bluetooth is cheap and you have a phone in your pocket (if you have fancy glasses, you most likely have a fancy phone in your pocket).

We are not yet at "no measurable power footprint", but we are getting pretty low.

Re:nice (an nitpick) (0)

Anonymous Coward | about 2 years ago | (#41331429)

A backlit screen will use at the very least the power it emits in light

So why backlight the screen? An eInk screen uses no power to hold an image; it only consumes power to change the display. The current problems are that color is expensive (which will probably improve over time) and that screen changes are slow (which has also been improving). This has left it to a limited set of uses, mainly e-readers, which display the same screen for some time. Amazon just released a front-lit version, which still uses less power when lit than a backlit screen.

It seems possible that eInk will continue to become cheaper, faster, and use less power. It may even be possible to make the contrast high enough that additional lighting is not needed (glow-in-the-dark ink?). You are probably right that this is not possible with a backlit screen, but why do you need a backlight? That's only one possible technical implementation to meet your needs.

Your point about WiFi is better, although there is probably a lot of room for improvement there as well. Most devices probably do not need to be connected most of the time. Many devices are stationary and do not need WiFi (my Nintendo Wii uses WiFi but my TiVo is connected via wire and right next to it). If Bluetooth becomes ubiquitous, it can replace WiFi for many applications. Yes, range is more limited, but with ubiquitous connections, you don't need range.

Re:nice (an nitpick) (0)

Anonymous Coward | about 2 years ago | (#41332173)

For WiFi you could imagine the device figuring out where the access point is and using a beam directed specifically in that direction, instead of wasting signal in all directions. The screen doesn't need to be backlit; you could imagine a WiFi connection (again with a directed beam) to an implant in your brain that makes you see a bright screen even though it isn't actually bright - or maybe there is no screen at all, you just see one. Or the whole thing could be running in your head in the first place. You could also have a laser from the room shine onto a spot on the device and use that light - though that's essentially drawing power from outside the device, so you could consider it cheating, but I think cheating is fine. Wireless charging could be used to run battery-less devices indefinitely. We are nowhere near the end of what nature will allow.

Re:nice (an nitpick) (1)

wvmarle (1070040) | about 2 years ago | (#41332497)

A backlit screen will use at the very least the power it emits in light, etc..

The problem is that we're still stuck with backlit screens.

The next logical step is that the back light can go, like in current e-ink. Sure there's quite a way to go, but I'm positive we'll eventually get there. A display that has vivid colours, fast refresh rates, no afterglow, and uses ambient light (reflective) to be readable. Power consumption: almost zero.

Sidestepping? (4, Insightful)

Riddler Sensei (979333) | about 2 years ago | (#41329725)

I wouldn't say that Intel is sidestepping those problems because they're not THEIR problems to address.

Re:Sidestepping? (0)

Anonymous Coward | about 2 years ago | (#41330489)

"I wouldn't say that Intel is sidestepping those problems because they're not THEIR problems to address."

Proper use of "they're" and "their" one word apart in the same sentence.

Am I really still on the internet, or does the matrix have me?

Kudos to you, sir.

I seriously doubt this. (0)

Anonymous Coward | about 2 years ago | (#41329761)

They will find a way to measure it.
Besides, once you reach the point where improving processor efficiency leads to negligible improvements in overall power consumption, the focus will shift to improving the other components, and it won't be worth the cost to further improve the processor.

Concept from Sci-Fi (1)

Stiletto (12066) | about 2 years ago | (#41329771)

Sounds like he's talking about "localizers" [wikipedia.org] . Probably similar stuff found in other books as well.

Color e-ink. (0)

Anonymous Coward | about 2 years ago | (#41329783)

I'm just waiting for color e-ink.
Now _that_ will be some major change in everything.

Re:Color e-ink. (1)

cykros (2538372) | about 2 years ago | (#41331369)

Even better: bioluminescent color e-ink, to remove the need to spend as much power on a display that is still visible in the dark (though, to be fair, there are some VERY energy-efficient booklights on the market which do the job well enough, even if a bit clunky). As for what kind of stimulus would be needed to trigger the bioluminescence, well, that's beyond my knowledge.

I'm more optimistic (4, Insightful)

SoftwareArtist (1472499) | about 2 years ago | (#41329795)

Yes, meaningful compute might approach zero energy — but touchscreens, displays, radios, speakers, cameras, audio processors, and other parts of the equation are all a long way away from being as advanced as Intel's semiconductor processes.

I think the author misunderstood what "ubiquitous" means. It means you can put serious computing power anywhere, including in places that don't have displays, cameras, etc. He's just thinking, "How far can they reduce the power use of my existing smartphone?" The real question is, "What completely new types of devices become practical when computing requires hardly any power at all?"

Also, the situation is better than he suggests. Bright, super high resolution LED or LCD displays take a lot of power, but eInk displays use hardly any power at all. That's why battery life is measured in hours for an iPad and in weeks for a Kindle. LTE radios use a lot of power, but 3G is fine for most applications, and even 2G is more than sufficient in many cases (not for web browsing, but for a device that just needs to exchange limited data with the outside world).

Re:I'm more optimistic (1)

Anonymous Coward | about 2 years ago | (#41330013)

Counterpoint: I work in the utility industry, which is rolling out well over a million smart meters (does that count as ubiquitous?).
Each of these meters has a 3G cell modem to call home and report meter readings and the like. 2G would be cheaper, but the cell companies can't promise that 2G will be around for the life of the meter (under 10 years).
Also take into account remote firmware updates, and 2G just won't cut it.

Re:I'm more optimistic (1)

petermgreen (876956) | about 2 years ago | (#41330685)

Smart meters have the advantage of being on-grid pretty much by definition. On-grid electricity is cheap.

Off-grid small-scale electricity is far more expensive. You either use primary batteries (only practical for very low consumption levels) or you use rechargeable batteries and try to find some way to recharge them (not cheap to set up).

Re:I'm more optimistic (1)

RightSaidFred99 (874576) | about 2 years ago | (#41331907)

2G is fine for remote firmware updates unless you need _instant_ remote firmware updates or your firmware is 2 gigs of data.

Re:I'm more optimistic (0)

Anonymous Coward | about 2 years ago | (#41330767)

There already exist microcontrollers that use less than a few uA of current.

http://www.microchip.com/pagehandler/en-us/technology/xlp/ [microchip.com]

    * 30uA/MHz power usage
    * 9 nA sleep mode

That is low power: roughly 0.08 mAh per year in sleep mode, or about 263 mAh per year running continuously at 1MHz.
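Sketching the battery arithmetic for those datasheet figures (constant-current draw assumed, battery self-discharge ignored):

```python
HOURS_PER_YEAR = 8766.0  # 365.25 days

def mah_per_year(current_amps: float) -> float:
    """Charge drawn over one year at a constant current, in mAh."""
    return current_amps * 1000.0 * HOURS_PER_YEAR

sleep_draw = mah_per_year(9e-9)     # ~0.08 mAh/year at the 9 nA sleep current
running_draw = mah_per_year(30e-6)  # ~263 mAh/year at 30 uA (1 MHz)
```

At the sleep figure, the battery's own shelf life, not the MCU, becomes the limiting factor.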

Re:I'm more optimistic (1)

Kjella (173770) | about 2 years ago | (#41332031)

I think the author misunderstood what "ubiquitous" means. It means you can put serious computing power anywhere, including in places that don't have displays, cameras, etc. He's just thinking, "How far can they reduce the power use of my existing smartphone?" The real question is, "What completely new types of devices become practical when computing requires hardly any power at all?"

Well, as a counterpoint, I would say we could have turned everything connected to the AC grid into "smart" devices already, but despite many, many house-of-the-future concepts, pretty much everything I see in stores is a regular old dumb device. So yes, maybe with extremely low power we could turn everything into a "smart" device, but I still have my doubts that we actually will.

Humans are expensive (0)

Anonymous Coward | about 2 years ago | (#41329805)

"touchscreens, displays, radios, speakers, cameras, audio processors"... i.e., things humans require to exchange information.

Meanwhile, the things computers need to exchange information remain cheap.

When bio-brains are more expensive than other, more powerful brains, things do not bode well for hu-mans!

Surprised (0)

Anonymous Coward | about 2 years ago | (#41329877)

The reading comprehension ability of most /. editors is such that I am surprised the headline wasn't "Zero-point Energy Powered Computers Nearly Here"

Near zero energy cost == singularity (2)

greg_barton (5551) | about 2 years ago | (#41330027)

When people think of the limits of strong AI (if they do at all) they generally focus on how complex it must be to create.

Complexity, however, is not the limiting factor. It is the fact that existing computers, compared to the brain, are energy hogs of epic proportions. The brain's energy use is on the order of millions of times more efficient than even the most power-stingy CPU. Even if we knew how to accomplish strong AI, we couldn't power the computer capable of supporting it.

That is, however, unless Intel reaches its goals.

Re:Near zero energy cost == singularity (0)

Anonymous Coward | about 2 years ago | (#41331195)

Bullshit? Aren't basic supercomputers (pick any from the Top 500) massively better than estimates for human computation? Isn't their consumption measured in the kilowatts/low megawatts? Total US power production [eia.gov] is in the gigawatts. You are off by several orders of magnitude.

Re:Near zero energy cost == singularity (0)

Anonymous Coward | about 2 years ago | (#41332193)

Our brains are very similar to other animals'. I wonder how much of our brain is tasked with doing things that we really don't NEED strong AI to be doing: walking around, breathing, regulating our stomachs, sensing what's touching our skin, smelling things, survival reflexes. I wouldn't be surprised if what we consider "higher thinking" is really done by only a small proportion of our brain and consumes only a small proportion of its total energy. If it is 0.1%, you could cut three orders of magnitude from your estimate, though it might also be 10%; I don't actually know. Does anyone know?

I'd be happy with a computer that computes at 1/1000 the speed of a human as long as it was as smart as a human. That could be a factor of a million in power consumption right there. Also, I'd be happy with an AI that consumed a MW of power, which gives you, what, another 10^5?

I think power consumption is simply not relevant to the issue of creating strong AI; it is only relevant if you want to put one of those in every pocket or something like that. The issue is what algorithm you make the AI with. If I had to guess, I think we have far more than enough computational power available to support strong AI; we just don't have the software.

Re:Near zero energy cost == singularity (1)

greg_barton (5551) | about 2 years ago | (#41332221)

How can one have common sense without experiencing the common?

And we certainly don't have the software. What I'm saying is that even if we did we couldn't run it. But maybe once our computers are power stingy enough we will be able to.

Reversible Computing (0)

Anonymous Coward | about 2 years ago | (#41330037)

According to Ray Kurzweil, reversible computing can consume essentially zero energy.

Re:Reversible Computing (1)

hajus (990255) | about 2 years ago | (#41332061)

I've been following Kurzweil for years, and I'm familiar with reversible computing, but I've never heard him comment on that topic. His arguments mostly stem from exponential growth of computing power. Citation? I'd like to read his words.

Like it isn't already ubiquitous (1)

kilodelta (843627) | about 2 years ago | (#41330045)

If I cast a net 10 feet around me, I have 9 devices with CPUs in them. Only two of them are full-up computers. Another two are smartphones. So the other five are my amateur radio gear, cable box, Wii, Xbox and TV.

Interestingly, the kitchen and dining room are the areas with the fewest CPUs. The office has a half dozen ATmegas as Arduino platforms, a TI Chronos watch, a Stellaris robot, an MSP430, and, oddest of all, the CPU in the Western Electric 1D2 pay phone I own.

Re:Like it isn't already ubiquitous (1)

lister king of smeg (2481612) | about 2 years ago | (#41331183)

You might want to check the kitchen again: your oven might have one if it is digital, your microwave probably has one, and hell, my blender has one. They may be crappy 8-bit ones, but they are CPUs. (And then you have your FreeBSD toaster.)

Thermodynamics (1)

tsotha (720379) | about 2 years ago | (#41330075)

Aren't there some fundamental physical limits on how low your energy usage can be for a given amount of information based on thermodynamics? Is it just the case that they're way, way less than what we're using now?

Re:Thermodynamics (4, Informative)

gotfork (1395155) | about 2 years ago | (#41330175)

Aren't there some fundamental physical limits on how low your energy usage can be for a given amount of information based on thermodynamics? Is it just the case that they're way, way less than what we're using now?

For any sort of data storage, the energy barrier between the two states needs to be large enough that the system doesn't thermodynamically fluctuate between them very often. In practice, this means the barrier needs to be several times larger than kB*T, where kB is the Boltzmann constant. For computation there's no hard and fast rule about the energy required, but there are lots of practical ones...
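
To get a feel for "several times larger than kB*T", here is a minimal sketch (my own assumed numbers, not from the comment above) treating bit flips as thermally activated over a barrier at an attempt frequency f0, so the mean retention time is roughly t = exp(dE/(kB*T)) / f0:

```python
import math

f0 = 1e9                             # assumed attempt frequency, Hz
t_retention = 10 * 365 * 24 * 3600   # target: hold a bit for ~10 years, in seconds

# Inverting t = exp(dE / (kB*T)) / f0 gives the barrier in units of kB*T:
barrier_in_kT = math.log(f0 * t_retention)
print(f"barrier needed: ~{barrier_in_kT:.0f} * kB*T")
```

With these assumptions the barrier comes out around 40 kB*T, which is why a few times kB*T is the practical floor for stable storage even though the theoretical switching energy is far smaller.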

Re:Thermodynamics (2)

TeknoHog (164938) | about 2 years ago | (#41332087)

For any sort of data storage, the energy barrier between the two states needs to be large enough that the system doesn't thermodynamically fluctuate between them very often. In practice, this means the barrier needs to be several times larger than kB*T, where kB is the Boltzmann constant. For computation there's no hard and fast rule about the energy required, but there are lots of practical ones...

Actually, there is a very similar limit for computation. In most of our computing we destroy a lot of information, and thus entropy is created. For example, adding two 64-bit numbers to produce a third one: you've just lost 64 bits of information. For each bit lost, you generate about kB*T*ln(2) of heat.

The general idea to counter this problem is called reversible computing, but I'm not sure how it would work in practice, as you'd have to store a lot of useless information.

Zero energy consumption... (0)

Anonymous Coward | about 2 years ago | (#41330081)

I predict zero energy consuming *everything* by 2020, since all the lights and energy production is going to collapse by then anyway.

Well, I guess you could count burning wood in fireplaces to stay warm after the zombie apocalypse as energy consumption, eh?

Re:Zero energy consumption... (1)

rrohbeck (944847) | about 2 years ago | (#41330163)

Well, you can always use solar, or one of these [biolitestove.com].

Re:Zero energy consumption... (0)

Anonymous Coward | about 2 years ago | (#41330733)

I'm sure it works, but who wants to carry wood with them when backpacking? In my part of the world, it's illegal to collect wood for fires. And during some parts of the year it's illegal to have a fire at all.

Glasses (2)

Dan East (318230) | about 2 years ago | (#41330097)

Regarding energy requirements for a display and touchscreen: both are greatly reduced with glasses (which, owing to their small size, are also the devices for which low power consumption matters most). Glasses are much closer to the eye, and ideally can direct the light straight into it. Modern displays are designed for maximum angle of visibility; they spew light over 180 degrees, on purpose, so they can be viewed from almost any angle. They are inefficient by design. So glasses can use much, much less power for display because they can be optimized in a number of ways.

Obviously glasses cannot use touchscreens either, but they instead use voice input, accelerometers, etc., which require very little power.

I call marketing BS (1)

rrohbeck (944847) | about 2 years ago | (#41330145)

Cutting down power consumption by some factor == "almost zero"?
This reduction in power will easily be eaten up by more and bigger applications.
I think this is a shot at ARM. If anybody should be talking about low-power computing, it's them. ARM with new tech like 22nm 3D multi-gate transistors would be *really* low power. Not Haswell & co.

Re:I call marketing BS (1)

RightSaidFred99 (874576) | about 2 years ago | (#41331921)

Only, well, ARM doesn't have that tech and won't for years. Intel owns the low power future because it's mostly about manufacturing technology, and they have it.

Re:I call marketing BS (0)

Anonymous Coward | about 2 years ago | (#41332433)

What the heck...

ARM licenses their designs to whoever wants to build the chips. So if you are a manufacturer that has the manufacturing technology you speak of, you license the ARM core and build away.

well (1)

Charliemopps (1157495) | about 2 years ago | (#41330217)

There are plenty of alternatives to audio and video that use very little power. The Kindle's e-ink screen, for example... and there's progress in delivering audio directly to the skeletal structure of your head, thereby using far less power. Wi-Fi, GPS and cellular signals are where the real problems lie.

I thought that it could be theoretically computed (2)

mark-t (151149) | about 2 years ago | (#41330595)

What the minimum amount of energy to perform a calculation was. I forget where I saw it, but I seem to remember it having to do with an equivalence of energy and information, which is to say that a certain (non-whole) number of bits can be represented per unit of energy. The minimum amount of energy required to reliably change a single bit can reasonably be derived from this. Using a Turing machine to model a calculation and counting the cycles it takes to complete, you could then calculate the minimum amount of energy needed to perform that calculation.

Although for trivial operations the energy requirements are absurdly tiny fractions of a joule, I might suggest that for the complex computing we perform today, those minimum energy requirements aren't going to be anywhere near as close to zero as they expect.

The only way it will really "approach" zero is if we start demanding less from computing devices. This may be happening in some areas already, but I wouldn't say it's a ubiquitous phenomenon.

Re:I thought that it could be theoretically comput (1)

TuringTest (533084) | about 2 years ago | (#41331969)

You wouldn't use a Turing machine to model the minimum energy needs of calculations, as they are woefully inefficient; in the same way, you wouldn't model addition by representing the naturals through the Successor function. [wikipedia.org]

The Turing machine was (and is) a reasonably good tool for proving the existence (or nonexistence) of computations, as it provides a quite simple and general computation model that is easy to work with symbolically. But it is nowhere near the minimum use of resources.

Re:I thought that it could be theoretically comput (1)

Kjella (173770) | about 2 years ago | (#41332053)

A minimum amount of energy it would require to reliably change a single bit can be reasonably be derived from this. Although for trivial operations, the energy requirements are absurdly tiny fractions of a joule, I might suggest that for modern complex computing that we perform today, those minimum energy requirements aren't going to be anywhere as near to zero as they expect.

It's Landauer's principle [wikipedia.org], but it's an extremely low limit. To quote WP:

At 25 °C (room temperature, or 298.15 kelvins), the Landauer limit represents an energy of approximately 0.0178 eV, or 2.85 zJ. Theoretically, room-temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media.
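
Those quoted numbers are easy to verify; here is the Landauer energy (kB*T*ln 2 per bit erased) computed directly:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 298.15            # room temperature, K
eV = 1.602176634e-19  # joules per electronvolt

landauer_J = k_B * T * math.log(2)  # minimum energy to erase one bit
print(f"{landauer_J:.2e} J = {landauer_J / eV:.4f} eV per bit")  # ~2.85e-21 J, ~0.0178 eV
print(f"at 10^9 bits/s: {1e9 * landauer_J:.2e} W")               # ~2.85e-12 W
```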

When you say "zero" (4, Informative)

Hatta (162192) | about 2 years ago | (#41330801)

You don't really mean zero. There is a fundamental minimum [wikipedia.org] amount of energy it takes to do a calculation. When Intel says "almost zero energy computing" how far over this limit are they actually talking about? 101% of the Landauer limit? 200%? 1000%?

Re:When you say "zero" (1)

MyLongNickName (822545) | about 2 years ago | (#41331163)

Well, considering:

"Theoretically, room-temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media."

Even if we approached 1 million percent of the fundamental minimum, we'd be doing pretty well.

Re:When you say "zero" (0)

Anonymous Coward | about 2 years ago | (#41331251)

Neat! I didn't know about such an idea before, thanks for sharing. Time for some napkin math.

If DDR3 can do 12,800 MB/s, we're talking in the neighborhood of 10^11 bits per second, which at that limit works out to around 10^-13 kilowatt-hours per hour. Even if real hardware is a million or a billion times worse than that (and CPU local caches are faster still), it's going to be super cheap: far less than 1 kWh, at typically $0.08 per kilowatt-hour here in the US. I'm not sure how many other bits have to get toggled for that memory to change (local caches, buses), but at this point it'd be worth computing the waste heat from the resistance in the motherboard's wiring.
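
Redoing that napkin math with explicit units (assuming "12,800 MB/s" means decimal megabytes, and memory that somehow operates right at the Landauer limit):

```python
import math

k_B, T = 1.380649e-23, 298.15       # Boltzmann constant (J/K), room temperature (K)
landauer_J = k_B * T * math.log(2)  # ~2.85e-21 J per bit erased

bits_per_s = 12_800e6 * 8           # 12,800 MB/s -> ~1.0e11 bits/s
power_W = bits_per_s * landauer_J   # ideal power at that bandwidth

kwh_per_hour = power_W / 1000       # watts -> kilowatt-hours per hour
print(f"{power_W:.2e} W, i.e. {kwh_per_hour:.2e} kWh per hour")
```

Real memory sits many orders of magnitude above this floor, but even a billionfold overhead would still be a rounding error on an electricity bill.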

The real question is, what will this do to BITCOIN?????

Re:When you say "zero" (1)

Namarrgon (105036) | about 2 years ago | (#41331299)

Then again, there's a lot of interest in reversible computing [wikipedia.org], which sidesteps Landauer's limit to some extent.

Re:When you say "zero" (1)

godrik (1287354) | about 2 years ago | (#41331397)

Of course there is a lower bound on the energy you need. But energy low enough that body heat or ambient temperature could power it is close enough to zero for me. That is what they are talking about: so low that the actual value does not matter.

Who forgets history is condemned to repeat it (1)

Tijaska (740114) | about 2 years ago | (#41331149)

When John von Neumann and his colleagues announced the world's first general-purpose programmable vacuum-tube computer, he was asked how many the world might need, and guessed about 24. He was right and he was wrong: 24 of those machines would have handled most of the serious number crunching then taking place. But the machines brought about a radical reduction in the cost of computing, and demand exploded.

run on zero energy (0)

rossdee (243626) | about 2 years ago | (#41331169)

In other news, the next republican administration has plans to repeal the Laws of Thermodynamics.

Jevons paradox says (3, Interesting)

doug141 (863552) | about 2 years ago | (#41331339)

this may cause an increase in energy used for computations. http://en.wikipedia.org/wiki/Jevons_paradox [wikipedia.org]

Re:Jevons paradox says (2)

BlackPignouf (1017012) | about 2 years ago | (#41332133)

Exactly my thought:

Look! Those glasses only use 0.1 watts for computing, that's almost-zero-energy-we're-gonna-save-the-world-with-our-super-green-glasses!
Guess what? Since those cool but kinda useless devices didn't exist yesterday, you haven't saved any energy; you've added new consumption.

Re:Jevons paradox says (1)

HungryHobo (1314109) | about 2 years ago | (#41332421)

Unless their presence allows someone to save energy in some other way.

If the map in your glasses means you don't take a wrong turn and end up having to drive 15 minutes back, they've just saved lots of energy.

If the cell phone you carry lets you call the delivery guy and tell him the order has been cancelled before he gets somewhere with a landline, you've just saved the energy cost of moving a truck for miles.

If that computer system in the warehouse tracks items better than a human operator, you've just saved the energy that would have been wasted when the paperwork for a pallet full of stock is lost and it goes bad before anyone notices.

That video link and high-bandwidth connection may take a bit of energy, but if it means one engineer doesn't need to catch a plane out to fix a problem in a factory somewhere, its energy cost gets paid back a hundred times over.

A Google search uses about the same amount of energy your body burns in ten seconds, but if you save a trip to the library or avoid a pitfall in some project as a result, it can save a massive amount of energy overall.
http://googleblog.blogspot.co.uk/2009/01/powering-google-search.html#!/2009/01/powering-google-search.html [blogspot.co.uk]

Overflow? (0)

Anonymous Coward | about 2 years ago | (#41331505)

Their next few chips will consume so much energy that the energy counter will overflow. I hope it's signed; then we can get some free energy until the exploit is fixed.

They should patent that idea (1)

Tony Isaac (1301187) | about 2 years ago | (#41331637)

Never mind that they haven't done it yet, or that the idea may be just pie in the sky. They had the idea first, so they should patent it. That way, if by some miracle somebody does do it one day, they can sue the pants off of them. That seems to be the way things are done these days.

Re:They should patent that idea (0)

Anonymous Coward | about 2 years ago | (#41332449)

There exists prior art in the form of ARM cpus.

Oculus Rift? (1)

DarwinSurvivor (1752106) | about 2 years ago | (#41331677)

Why would the Oculus Rift need this type of low power consumption? They do know it gets plugged in, right?

Battery life declining (1)

Animats (122034) | about 2 years ago | (#41331723)

Then how come the newer iPhones have worse battery life than the old ones?

Tesla Energy? (0)

Anonymous Coward | about 2 years ago | (#41332365)

How much "Tesla energy" (electricity harvested from the background) could be produced, and how does this compare to the consumption requirements of driving microcircuits?
