
Sacrificing Accuracy For Speed and Efficiency In Processors

Soulskill posted more than 5 years ago | from the whi-mnds-a-fwe-erorsr dept.

Math 499

Skudd writes "Modern computing has always been reliant on accuracy and correct answers. Now, a professor at Rice University in Houston posits that some future applications could be revolutionized by 'probabilistic computing.' Quoting: 'This afternoon, Krishna Palem, speaking at a computer science meeting in San Francisco, will announce results of the first real-world test of his probabilistic computer chip: The chip, which thrives on random errors, ran seven times faster than today's best technology while using just 1/30th the electricity. ... The high density of transistors on existing chips also leads to a lot of background "noise." To compensate, engineers increase the voltage applied to computer circuits to overpower the noise and ensure precise calculations. Palem began wondering how much a slight reduction in the quality of calculations might improve speed and save energy. He soon realized that some information was more valuable than other information. For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81." Producing an answer of $13,000.57 is much closer to being correct than $57,000.81. While Palem's technology may not have a future in calculating missions to Mars, it probably has one in such applications as streaming music and video on mobile devices, he said.'"
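The summary's digit-significance argument can be sketched with a toy model of an adder whose low-order bits occasionally flip. The bit width and flip probability below are invented for illustration, not taken from Palem's chip:

```python
import random

def noisy_add(a, b, flip_prob=0.1, noisy_bits=4):
    """Toy probabilistic adder: each of the lowest `noisy_bits` bits of
    the result may flip with probability `flip_prob`, but the high-order
    ("$13,000") bits are always exact. Parameters are invented."""
    result = a + b
    for bit in range(noisy_bits):
        if random.random() < flip_prob:
            result ^= 1 << bit  # flip one low-order bit
    return result

random.seed(1)
exact = 650040 + 650041   # 1300081, i.e. $13,000.81 in cents
trials = [noisy_add(650040, 650041) for _ in range(10_000)]
worst = max(abs(t - exact) for t in trials)
print(worst)  # bounded by 2**noisy_bits - 1 = 15
```

Under this model the answer is frequently a few cents off but can never be $57,000.81 off, which is the whole point of confining the noise to low-significance bits.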


Bank balance (5, Funny)

johnny cashed (590023) | more than 5 years ago | (#26773965)

And $81,000.31 is a much more correct answer!

Re:Bank balance (4, Insightful)

CastrTroy (595695) | more than 5 years ago | (#26774003)

I agree. The whole problem with the example given in the summary is that your bank balance should never be wrong. There is no room for error in calculating bank balances. I also don't want to hear skips and pops in my music because they thought it would be more energy efficient to use a processor that produced errors. I already get 26 hours of charge out of my MP3 player. I'd rather have them focus on getting more space for cheaper so I can carry lossless audio on my portable mp3 player.

Re:Bank balance (5, Insightful)

BadAnalogyGuy (945258) | more than 5 years ago | (#26774039)

If you are listening to music on a portable media device, it's safe to say that you aren't going to be able to hear the difference between the lossy format and the lossless format.

It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.

Re:Bank balance (1)

Jeff DeMaagd (2015) | more than 5 years ago | (#26774065)

I expect financial calculations to be accurate to the penny, or even calculated to the third or fourth decimal place and then rounded to two decimal places.

But I agree, audio & video playback and other things are different. An occasional error on the 15th or 16th bit isn't going to be audible in real-world portable circumstances.

Re:Bank balance (4, Interesting)

phoenix321 (734987) | more than 5 years ago | (#26774409)

High accuracy is required for encoding music and video, though.

Maybe we could have a selective accuracy, where programmers can set their needs via registers or direct it to different CPU cores. Accurate cores for banking transactions, AV-stream encoding and 2D GUI operations while inaccurate cores are used for AV-stream decoding, and computer game 3d drawing and AI decisions.

There's a whole lot we are calculating now without the need for more than 3 significant digits - and a whole bunch where we intentionally use random numbers, sometimes even with strong hardware entropy gathering.

These are all cases where we could just scrap the accuracy for faster processing or longer battery times. No one cares about single bit errors in portable audio decoders or in high fps 3d gaming.

Or maybe we could... (2)

PaulBu (473180) | more than 5 years ago | (#26774585)

Maybe we could have a selective accuracy, where programmers can set their needs via registers or direct it to different CPU cores ... just keep using (say, in C) short, int, long, long long (no, AV codecs should not require floating point, but if you wish, there are floats, doubles, long doubles, etc.).

Of course proper implementation of some AV decoder on a modern processor will use available SIMD instructions (MMX and friends) where programmer can easily trade off accuracy (in byte-sized chunks) for speed.

Paul B.

Re:Bank balance (5, Insightful)

Hojima (1228978) | more than 5 years ago | (#26774467)

There are countless applications for a computer that don't depend on accuracy, but do depend on speed. For example: gaming, stock analysis, scientific/mathematical research etc. Just about every use for the computer can benefit from this. Bear in mind these applications can take the hit of inaccuracy, if not benefit from it depending on the situation. Yes, there are some instances where accuracy is crucial, but that's why they will continue to make both kinds of processors. It's what they call a free market, and there will always be a new niche to fill.

Re:Bank balance (1)

ZombieWomble (893157) | more than 5 years ago | (#26774675)

The issue is, what does this CPU actually offer -- specifically, how large would the loss in quality be? Would simply cutting down on the accuracy of the algorithms used, to reduce the computational burden and thus the amount of processing power needed, work just as well?

There's a dearth of hard information on the chip in the article (all the headline-grabbing speedup values, but no details on exactly how much accuracy is lost), but it seems to me that such a chip would really need to conclusively prove that the accuracy trade-offs cannot be made just as well at the software level as at the hardware level.

Re:Bank balance (2, Interesting)

moderatorrater (1095745) | more than 5 years ago | (#26774373)

I disagree. Cents are already an arbitrary cutoff for the calculations' accuracy; why not just cut it off at dollars? I do my budget religiously every night and go over everything with my wife a couple of times a month. If every transaction were rounded up to the nearest dollar, it wouldn't destroy my finances. I doubt it would seriously mess up my finances if they were off by $5. Now, if I were able to give up efficiency and accuracy in my financial calculations for something else that I considered valuable (for instance, lower interest on my mortgage, or my credit card payments counting for 30x what they did before), then this is something that should seriously be considered.

With regards to music, they're not talking about skips and pops, they're talking about extremely slight modulations in pitch or, in the case of video, a very slight difference in color. If these differences are random and small enough (say, 300 per second), then it averages out to the same thing and our minds can't tell the difference. Hell, if the differences are small enough you wouldn't really notice them anyway. If I could get 30x the battery life out of my laptop by accepting imperfections in the video it displays and in the audio it plays (and I know it wouldn't, but this is a hypothetical), then I'd gladly go for it.
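The averaging claim is easy to check: zero-mean errors wash out over many samples. The error model below (each decoded sample off by up to ±2 quantization steps, uniformly) is invented for illustration:

```python
import random

random.seed(0)
true_sample = 1000
# Hypothetical decoder output: each sample is perturbed by a small,
# zero-mean error of at most +/-2 steps (invented error model).
noisy = [true_sample + random.randint(-2, 2) for _ in range(100_000)]

avg = sum(noisy) / len(noisy)
worst = max(abs(s - true_sample) for s in noisy)
print(avg, worst)  # the average sits very close to the true value
```

Each individual sample can be slightly wrong, but because the errors are small and unbiased, the aggregate signal stays essentially on target, which is the intuition behind tolerating this in audio and video.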

Re:Bank balance (1)

Sancho (17056) | more than 5 years ago | (#26774597)

Cutting off at the cents place isn't arbitrary--it's done because the cent is the smallest unit of currency produced by the US.

Re:Bank balance (1)

tshetter (854143) | more than 5 years ago | (#26774687)

Tell that to gas stations and their $1.909/gal price.

Re:Bank balance (0)

Anonymous Coward | more than 5 years ago | (#26774665)

But, to quote a popular saying from Britain, "A penny saved is a penny earned."
I am sure this still applies to the cent.
And I would consider it doubly important in this recession.

These kinds of things would be better suited to things that don't require accuracy, like liquid measurements for foods (nobody ever gets it 100% perfect, so why try?)

Primality testing (2, Informative)

2.7182 (819680) | more than 5 years ago | (#26774131)

Random algorithms are used all the time. In RSA for example, random primes must be generated. This is done with an algorithm that probably gives the right answer, which is good enough. The chance that it would fail is so tiny as to not matter.
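The parent's point deserves a concrete example. Below is a minimal Miller-Rabin sketch, the standard probabilistic primality test; the choice of 20 rounds is arbitrary here:

```python
import random

def is_probably_prime(n, rounds=20):
    """Miller-Rabin test: a composite n survives one round with
    probability at most 1/4, so a wrong 'prime' verdict after 20
    rounds has probability at most 4**-20."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:          # write n - 1 = d * 2**s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False        # found a witness: definitely composite
    return True                 # prime with overwhelming probability

print(is_probably_prime(2**127 - 1))  # a known Mersenne prime
```

RSA key generation runs exactly this kind of test over random candidates: a 4**-20 chance of a wrong verdict is far below the chance of a hardware fault.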

Re:Primality testing (2, Interesting)

phoenix321 (734987) | more than 5 years ago | (#26774449)

Even more so: we intentionally gather entropy to improve the pseudo random numbers. With intentionally inaccurate CPU cores, we could scrap all that and gather entropy en-passant AND be much faster anyway.

Re:Bank balance (1)

gbjbaanb (229885) | more than 5 years ago | (#26774293)

For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81."

not to an accountant it isn't.

Besides, to take the example further: if getting the $800 right is much more important than the rest of the $800,000,000,000, does that mean no one will care if my account suddenly goes from $2000 to $20,000?

Reminds me of... (5, Funny)

rob1980 (941751) | more than 5 years ago | (#26773973)

Q: Why didn't Intel call the Pentium the 586?
A: Because they added 486 and 100 on the first Pentium and got 585.999983605.

Re:Reminds me of... (1)

cbiltcliffe (186293) | more than 5 years ago | (#26774263)

I was thinking along the same lines.

Although considering they call this thing a probabilistic chip, maybe we'll soon have the Infinite Improbability Drive.

Re:Reminds me of... (1)

FlyByPC (841016) | more than 5 years ago | (#26774543)

I was thinking along the same lines.

Although considering they call this thing a probabilistic chip, maybe we'll soon have the Infinite Improbability Drive.

We're going to need a really hot cup of tea...

0.9999657:st post (1)

assert(0) (913801) | more than 5 years ago | (#26773977)

innit?

Accuracy with financial calculations. (5, Funny)

onion2k (203094) | more than 5 years ago | (#26773981)

Accuracy with financial calculations is extremely important. Hasn't this guy ever watched Superman 3?

Re:Accuracy with financial calculations. (2, Informative)

Briareos (21163) | more than 5 years ago | (#26774027)

Hasn't this guy ever watched Superman 3?

Maybe he just watched Office Space and missed the whole Superman 3 reference?

np: Fennesz - Vacuum (Black Sea)

Re:Accuracy with financial calculations. (0)

Anonymous Coward | more than 5 years ago | (#26774119)

Re:Accuracy with financial calculations. (1)

Firethorn (177587) | more than 5 years ago | (#26774281)

I'll note that he was making the point that the thousands part is far more important than the cents part.

Basically, it sounds like he dumbed down the answer too much. Of course the cents are important in your bank account. And, more importantly, it's fairly trivial for us to KEEP that accuracy.

Still, when you start expanding it to, say, a company's balance on the books, you tend to get errors. Think of it like a warehouse inventory - every time you do an inventory, there's a chance that somebody will count something wrong, and you'll come up with a slight error. Thing is, once that error is less than real 'noise' like shrinkage, defective products, waste, etc., it doesn't matter much and is easily compensated for.

On another topic - voting ballots. You could have a 99.999% accuracy rate, but because you have millions of ballots, some will still be screwed up. In most elections, it doesn't matter, because the actual difference in votes exceeds the margin of error by a significant amount (.001% error rate, 6% difference in polling).

Multimedia wise, if we can accept some error in the least significant bits for color or sound range, which we already do with lossy compression, in exchange for faster, cheaper, and cooler/more energy efficient chips, it could have great value.

Re:Accuracy with financial calculations. (1)

peragrin (659227) | more than 5 years ago | (#26774417)

Quite correct, the thousands is far more important than the cents. However, 13,810.00 is really close to 13,000.81, right? It has all the same numbers in a similar order.

Banks calculate out to the ten-thousandths place simply because you need that much for rounding errors. Heck, in my business most items are priced out to the ten-thousandths place, with prices like $121.3456 for a cost.

Use in MP3 Players (2, Funny)

AmigaHeretic (991368) | more than 5 years ago | (#26773991)

So what you're saying is that it might make all my MP3s sound like they are AutoTuned? But the battery will last 30 times longer?

I guess the question is can Cher sue over this technology?

Re:Use in MP3 Players (4, Interesting)

phoenix321 (734987) | more than 5 years ago | (#26774503)

If you bought a popular artist recently, your music is autotuned already.

Anyway, this means that less than 0.1 percent of your decoded audio samples will be less than 0.1 percent off. This is probably very acceptable outside concert halls and living rooms if it delivers large bonuses in battery savings or calculation speed.

For example, we could use a much beefier compression algorithm than MP3, or run current algorithms even longer on even smaller devices.

Cher Patent Term Extension Act (1)

tepples (727027) | more than 5 years ago | (#26774551)

I guess the question is can Cher sue over this technology?

Only if lawmakers pass the Cher Patent Term Extension Act [kuro5hin.org] .

Old Tech? (1)

JorDan Clock (664877) | more than 5 years ago | (#26773995)

Didn't Intel already implement this technology in the original Pentium's FPU? Didn't seem very desirable to me...

wll, (5, Funny)

greenguy (162630) | more than 5 years ago | (#26774033)

i scrfcd accrc 4 spd a lng tm ago

Top Ten Slogans (5, Funny)

AmigaHeretic (991368) | more than 5 years ago | (#26774057)

TOP TEN SLOGANS FOR THIS NEW PROCESSOR:

9.9999973251 - It's a FLAW, Dammit, not a Bug

8.9999163362 - It's the new math

7.9999414610 - Nearly 300 Correct Opcodes

6.9999831538 - "You Don't Need to Know What's Inside" (tm)

5.9999835137 - Redefining the PC -- and Mathematics As Well

4.9999999021 - We Fixed It, Really

3.9998245917 - Division Considered Harmful

2.9991523619 - Why Do You Think They Call It *Floating* Point?

1.9999103517 - We're Looking for a Few Good Flaws

0.9999999998 - "The Errata Inside" (tm)

Re:Top Ten Slogans (4, Funny)

gbjbaanb (229885) | more than 5 years ago | (#26774353)

TOP TEN SLOGANS:

runs Excel just as well as always :-)

It seems like when you need a precise calculation (4, Insightful)

rolfwind (528248) | more than 5 years ago | (#26774071)

like financial things, you compare the end product with the end product of the same calculation run either through the chip again or another chip (or increasingly likely another core).

Still would be faster too.

Re:It seems like when you need a precise calculati (1)

phoenix321 (734987) | more than 5 years ago | (#26774559)

Repeating the process is not going to improve the results.

Suppose you repeat your calculation 3 times. How will you know that the result of comparing the three results with one another is correct?

What if the only answer you can obtain is "A equals B with a probability of 0.9998"? Recursively repeat this comparison and then compare the results? :)

Look, it's probability theory... (1)

mosel-saar-ruwer (732341) | more than 5 years ago | (#26774661)


Suppose you repeat your calculation 3 times...

If you get three different answers, then you just take the average.

You could even throw in a few extra circuits to throw out wildly anomalous values.

In fact, you could change your computer circuitry from ADD(X, Y) to something like ADD(X, Y, SD), where SD is your [hypothesized] standard deviation, and any cluster of values which appear to be generating an unacceptable standard deviation would get tossed.
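The `ADD(X, Y, SD)` idea can be sketched in software. Everything below is invented for illustration -- the unreliable adder's error model, the sample count, and a median-based cutoff standing in for the standard-deviation circuit:

```python
import random
import statistics

def noisy_add(x, y, error_prob=0.05):
    """Hypothetical unreliable adder: usually exact, occasionally
    wildly wrong. This error model is invented for illustration."""
    r = x + y
    if random.random() < error_prob:
        r += random.choice([-1, 1]) * random.randint(100, 1000)
    return r

def add_with_sd(x, y, sd=5.0, samples=15):
    """ADD(X, Y, SD) in software: run the unreliable adder several
    times, toss results further than `sd` from the median, and
    average the survivors."""
    results = [noisy_add(x, y) for _ in range(samples)]
    med = statistics.median(results)
    kept = [r for r in results if abs(r - med) <= sd]
    return sum(kept) / len(kept)

random.seed(42)
print(add_with_sd(1300, 81))  # recovers the exact sum with overwhelming probability
```

The wild errors are hundreds of units away from the median, so the cutoff discards them and the surviving values average back to the correct answer.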

Re:It seems like when you need a precise calculati (0)

Anonymous Coward | more than 5 years ago | (#26774691)

like financial things, you compare the end product with the end product of the same calculation run either through the chip again or another chip (or increasingly likely another core).

There's nothing saying the probabilities are independent of each other. It's possible that identical calculations will result in the same wrong answer multiple times, instead of a range of wrong answers.

DSP's? (2, Insightful)

Zantetsuken (935350) | more than 5 years ago | (#26774075)

Isn't that the point of using a DSP? So you can use a slower CPU to run the firmware and the DSP do that grunt work of decoding, thus letting you save power with the low voltage CPU?

My question is, if it's just as well to use a DSP, why not just use a damned DSP?

Re:DSP's? (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26774333)

Presumably, the idea is to use this technology to make good-enough DSPs that are much more efficient.

Re:DSP's? (4, Insightful)

Firethorn (177587) | more than 5 years ago | (#26774425)

I think the point would be using a DSP/math coprocessor that uses 1/30th the power in exchange for a .001% loss in accuracy for non-essential tasks like music decoding.

I mean, combined with the lousy earbuds most people use, who'd notice? Especially if it makes their MP3 player last 3 times as long as ones that use more traditional and technically accurate DSP/decoder?

Financial accuracy is foremost (0)

Anonymous Coward | more than 5 years ago | (#26774081)

Dearest depositor,

Bank is using new probalistic computers, and might be having errors. Da, you will see you are having no money now. Please to be calling bank.

Not completely correct (3, Insightful)

chthon (580889) | more than 5 years ago | (#26774099)

If you read the chapter on the history of the IEEE FP standard in Computer Architecture: A Quantitative Approach, you will see that accuracy was already sacrificed for speed in supercomputers of the past.

Hmmm (1)

Metabolife (961249) | more than 5 years ago | (#26774103)

What happens when you add up millions of these inaccurate accounts and end up gaining or losing millions of dollars?

Re:Hmmm (5, Funny)

BadAnalogyGuy (945258) | more than 5 years ago | (#26774133)

You get a one-way ticket to pound-me-in-the-ass prison.

Watch out for your cornhole, bud.

Re:Hmmm (5, Funny)

PitViper401 (619163) | more than 5 years ago | (#26774269)

But what about the conjugal visits?!

Financial transactions aside... (1)

suffix tree monkey (1430749) | more than 5 years ago | (#26774489)

...it would definitely make some games more fun. Imagine that at the end of the game, you meet the big boss - but instead of his usual self (object 13023), he looks like the one at 13024, a steel door.

Re:Financial transactions aside... (1)

Shadow of Eternity (795165) | more than 5 years ago | (#26774589)

Given the current state of difficulty settings and boss AI I'm not entirely sure players would notice the difference.

gfx (4, Insightful)

RiotingPacifist (1228016) | more than 5 years ago | (#26774117)

Can't this be used in gfx cards? I mean, with anti-aliasing and high resolutions it doesn't really matter so much if half a pixel is #ffffff or #f8f4f0. Hell, you can probably even get a pixel entirely wrong for one frame and nobody will care (as long as it doesn't happen too often).

Re:gfx (2, Interesting)

retroStick (1040570) | more than 5 years ago | (#26774279)

I agree. In fact, with real-time photorealistic rendering, these slight deviations would probably make frames look more accurate, since real video is full of low-level random noise.
The film-grain shader on Left 4 Dead wouldn't be necessary any more.

Re:gfx (1)

Dogtanian (588974) | more than 5 years ago | (#26774509)

Funny you should mention that; as far as I'm aware some graphics cards *do* tolerate these sorts of minor glitches. IIRC, I heard this in connection with a /. article that discussed using GFX chips as general-use processors.

Re:gfx (1)

FlyByPC (841016) | more than 5 years ago | (#26774577)

hell you can probably even get a pixel entirely wrong for one frame and nobody will care (as long as it doesn't happen too often).

You must be new enough here to have never been fragged by a shock spell across a dark room. Flashes of light -- even small ones -- are important, at least in Oblivion.

Multimedia processor (2, Informative)

Gerald (9696) | more than 5 years ago | (#26774121)

If you're about to join the upcoming avalanche of smartass comments, try reading the UDP-Lite RFC [ietf.org] first. For some applications (notably real-time voice and video), timeliness and efficiency are more important than accuracy.

If this means my music player or phone get more battery life, I'm all for it.

Re:Multimedia processor (1)

ndogg (158021) | more than 5 years ago | (#26774347)

I'm with you there. I want to listen to my music for weeks without having to plug it in at all, especially if I'm camping.

Re:Multimedia processor (1)

mbone (558574) | more than 5 years ago | (#26774405)

Another appropriate comparison would be UDP with Forward Error Correction (FEC), which is a probabilistically guaranteed delivery mechanism (i.e., the data will probably arrive in full, but there is no guarantee, for times when accepting that risk is better than requiring delivery).

uhhh.... (0, Troll)

girlintraining (1395911) | more than 5 years ago | (#26774171)

NEWS FLASH: Binary consists of 1 and 0. Either the answer is right, or it isn't. "probablistic computing" is another way of saying "sloppy engineering". There is no random, only Zuul!

Re:uhhh.... (5, Informative)

Dogtanian (588974) | more than 5 years ago | (#26774393)

NEWS FLASH: Binary consists of 1 and 0.

NEWS FLASH: People use computers for calculations with more than single-digit binary results.

"probablistic computing" is another way of saying "sloppy engineering".

No -- insisting on excessive precision where an "almost certainly right to within +/- x%" solution would be more than good enough and much simpler to obtain is known as overengineering.

I suspect that the financial examples chosen didn't illustrate the point as well as intended (financial companies generally don't like *any* inaccuracy), but that doesn't change the general principle.

Would you prefer a routing algorithm that gobbled up power and took ages to run for a guaranteed shortest route or one that was far more efficient and 99.9% certain to give a route that was within 3% of the shortest possible distance?

Hell, they kind of already do this (1)

NotSoHeavyD3 (1400425) | more than 5 years ago | (#26774593)

I mean, a float isn't an exact number. (Actually the hard sciences have that concept as well, called significant figures. To give an analogy based on a joke: if a museum guard started working in a science museum 15 years ago, and there was a fossil there that was 80 million years old when he started, the fossil is not 80,000,015 years old now.)
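True -- ordinary binary floating point already gives up exactness, just deterministically:

```python
# 0.1 and 0.2 have no finite binary representation, so their sum
# misses 0.3 -- but only in the least significant bits, exactly the
# bits the digit-significance argument says matter least.
a = 0.1 + 0.2
print(a == 0.3)        # False
print(abs(a - 0.3))    # on the order of 1e-17
```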

Re:uhhh.... (1)

phoenix321 (734987) | more than 5 years ago | (#26774619)

What if good enough really IS good enough? In a fast-paced game, would you take a tenfold increase in graphics speed in exchange for a tiny percentage of pixels being barely noticeably off the correct color?

No company will use it (0)

whitroth (9367) | more than 5 years ago | (#26774173)

When I was first studying programming, back when we had to use magnets to write 1's and 0's (ok, punch cards), we heard about the unnamed bank, somewhere in the sixties or early seventies, where a programmer had coded so that fractions of a penny were rounded down, always, and the difference showed up in an account he'd set up. He made many tens of thousands of dollars (and back then, that was real money) before he was caught. Now with orders of magnitude more transactions, there is no way any company would use this kind of probability.

And if you want to argue this, I suggest you go talk to the bean counters where you work.

        mark

Re:No company will use it (1)

BikeHelmet (1437881) | more than 5 years ago | (#26774337)

But think of the power savings! :P

Honestly, this sounds more like a DSP replacement. Most computers need exact math, but some don't.

Re:No company will use it (1)

nycguy (892403) | more than 5 years ago | (#26774403)

There are plenty of calculations in finance where only a few digits of accuracy are needed, either because a price is expressed in only a few digits (e.g., an option value being $1.46 is fine as opposed to $1.46189503 which will get rounded to $1.46 anyway) or because the inputs to the model have large errors themselves (e.g., a volatility of 21% which could just as easily be 20% or 22%, making the less significant digits meaningless anyway). These calculations also happen to be the ones that are done in real time throughout a trading day with countless cycles being wasted on unnecessary accuracy.

In a larger context, there are a ton of financial products that are valued using Monte Carlo simulations. By definition, you've already got probabilistic inaccuracy there. Why not make the individual Monte Carlo paths a bit less precise and just run a lot more of them, since it's the average that really counts?

Now, it could be that what I'm asking for here is really adaptive/selectable precision, but the point is that not all calculations need to be 15 digits accurate or even reproducible to be useful.
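A sketch of the Monte Carlo point, with invented option parameters and zero-mean per-path noise standing in for a less precise core:

```python
import math
import random

def mc_call_price(paths, path_noise=0.0):
    """Monte Carlo value of a European call (S0=100, K=100, r=0,
    vol=20%, T=1). `path_noise` adds zero-mean noise to each path's
    payoff, as a stand-in for imprecise per-path arithmetic."""
    total = 0.0
    for _ in range(paths):
        z = random.gauss(0.0, 1.0)
        s_t = 100.0 * math.exp(-0.5 * 0.2**2 + 0.2 * z)  # terminal price
        payoff = max(s_t - 100.0, 0.0)
        total += payoff + random.gauss(0.0, path_noise)   # noisy payoff
    return total / paths

random.seed(7)
precise = mc_call_price(200_000)
sloppy = mc_call_price(200_000, path_noise=1.0)
print(precise, sloppy)  # both land near the Black-Scholes value ~7.97
```

The Monte Carlo error from the finite number of paths dwarfs the injected per-path imprecision, so the noisy run is statistically indistinguishable from the precise one -- which is why "run more, cheaper paths" is a plausible trade.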

Re:No company will use it (1)

kwikrick (755625) | more than 5 years ago | (#26774441)

well, lets argue it a bit anyway, for fun.

If fractions are rounded up and down randomly - I mean with an equal chance - then there should be no overall loss or gain in the long run. Of course, there might be outliers: some accounts might get shorted a bit more, some might get lucky, and some people may try to game the system. Sounds just like real life. It wouldn't make such a big difference.

Still, you're right, nobody would accept such a system, because we all like the illusion of control.
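The parent's intuition checks out if the rounding is genuinely unbiased. A sketch with invented random transactions, using stochastic rounding (round the fractional cent up with probability equal to the fraction):

```python
import random

def stochastic_round_cents(amount_dollars):
    """Round a dollar amount to whole cents, rounding the fractional
    cent up with probability equal to the fraction -- unbiased in
    expectation, unlike the always-round-down salami scheme."""
    cents = amount_dollars * 100
    whole = int(cents)
    frac = cents - whole
    if random.random() < frac:
        whole += 1
    return whole

random.seed(3)
true_total = 0.0
rounded_total = 0
for _ in range(100_000):
    amount = random.uniform(0, 100)      # an invented transaction
    true_total += amount * 100           # exact value, in cents
    rounded_total += stochastic_round_cents(amount)

drift = rounded_total - true_total
print(drift / true_total)  # aggregate drift is tiny: nobody is skimming
```

Contrast this with always rounding down, which biases every transaction in the same direction and is exactly how the classic salami-slicing fraud accumulated money.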

Re:No company will use it (1)

Gerald (9696) | more than 5 years ago | (#26774587)

And if you want to argue this, I suggest you go talk to the bean counters where you work.

Bean counters won't use this (at least I hope not). Telecom engineers will.

Sacrifices are expected (4, Interesting)

LiquidCoooled (634315) | more than 5 years ago | (#26774177)

I have spent the last 9 months coding up a dynamic scalable UI for the nokia tablets.

I have had to make huge compromises to accuracy to obtain the desired performance.

I had the choice of using the full-featured (but slow) widget sets and graphical primitives which existed already, or finding a way to make it work as I expected it to.

The results have left people breathless :)

take a look here: http://www.youtube.com/watch?v=iMXp0Dg_UaY [youtube.com]

Re:Sacrifices are expected (0)

Anonymous Coward | more than 5 years ago | (#26774621)

And this is related to the story how?

Suitable for streaming media??? (2, Funny)

Anonymous Coward | more than 5 years ago | (#26774181)

I'd like to see executives at CBS explain how nipples showed ON TOP of a superbowl performer's outfit.

Talk about a wardrobe malfunction.

I can see the defense now:

Your honor: We ran probabilistic tests with our processors, and while we couldn't really duplicate the problem, we were able to show a penis during one test run. We'd really like to show it to you, but Ms. Jackson has stated that she would quote "sue us into the ground" unquote.

Re:Suitable for streaming media??? (0)

Anonymous Coward | more than 5 years ago | (#26774419)

we were able to show a penis during one test run.

yeah, at http : // peniz.notlong.com

Nothing new here... (3, Insightful)

chill (34294) | more than 5 years ago | (#26774197)

Isn't that essentially what JPEG, MPEG and every other lossy codec or transform does?

Re:Nothing new here... (1, Interesting)

kmac06 (608921) | more than 5 years ago | (#26774235)

Yes. Except now it's done in hardware. So no.

Re:Nothing new here... (3, Interesting)

chill (34294) | more than 5 years ago | (#26774307)

There have been dedicated MPEG encoding and decoding chips for many years. DXR3 comes to mind.

I think the only new twist is applying the idea to general calculations as a whole as opposed to a specific function or set of functions in software. An interesting idea. Maybe we'll end up with double-precision, single-precision and ballpark floats.
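A "ballpark float" is easy to emulate by truncating a double's mantissa; the 10-bit figure below is an arbitrary choice for illustration, not a proposed format:

```python
import struct

def ballpark(x, keep_bits=10):
    """Keep only the top `keep_bits` of a double's 52-bit mantissa,
    zeroing the rest -- a software stand-in for a hypothetical
    low-precision hardware float."""
    (bits,) = struct.unpack('<Q', struct.pack('<d', x))
    mask = ~((1 << (52 - keep_bits)) - 1) & 0xFFFFFFFFFFFFFFFF
    (y,) = struct.unpack('<d', struct.pack('<Q', bits & mask))
    return y

x = 13000.81
print(ballpark(x))  # close to x, relative error below 2**-10
```

Because only mantissa bits are dropped, the error is strictly bounded relative to the magnitude of the value -- the "get the 13 right, fudge the 81" property from the summary.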

Re:Nothing new here... (1)

John Hasler (414242) | more than 5 years ago | (#26774277)

No. Those are lossy but deterministic.

Games? (3, Insightful)

ndogg (158021) | more than 5 years ago | (#26774267)

I could definitely foresee this being used in game systems, especially for graphics.

As long as it mostly looks right, that's all that really matters.

A Safety Sticker ? (1)

mbone (558574) | more than 5 years ago | (#26774295)

So, what, future computers may come with a big sticker :

WARNING : Should not be used in life-critical calculations.

On the other hand, if the errors are really rare and random, and he can make chips 7 times faster at 1/30th the power drain, then you could array 3 such chips at 7 times the speed and 1/10 the power usage, and do your computations by majority vote.
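The three-chip majority vote is easy to model with an invented error rate; it doesn't eliminate errors, but it roughly squares the per-chip failure probability:

```python
import random

def flaky_square(x, error_prob=0.01):
    """Hypothetical fast-but-flaky chip computing x*x, with a rare
    random error (error model invented for illustration)."""
    result = x * x
    if random.random() < error_prob:
        result += random.randint(1, 100)
    return result

def voted_square(x):
    """Majority vote over three flaky runs: an undetected error now
    needs at least two runs to go wrong."""
    a, b, c = flaky_square(x), flaky_square(x), flaky_square(x)
    if a == b or a == c:
        return a
    if b == c:
        return b
    return a  # all three disagree: rare, and still possibly wrong

random.seed(11)
unvoted = sum(flaky_square(9) != 81 for _ in range(100_000))
voted = sum(voted_square(9) != 81 for _ in range(100_000))
print(unvoted, voted)  # ~1000 errors unvoted, a few dozen with voting
```

Each vote costs three runs, but at 1/30th the power per chip the array still comes out far ahead, which is the arithmetic behind the "1/10 the power" figure above.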

Re:A Safety Sticker ? (0)

Anonymous Coward | more than 5 years ago | (#26774533)

Obviously, you realize that that just makes the errors 'more rare', it doesn't eliminate them, so you are back to square one (rare and random errors).

I Call BS (1, Redundant)

Shuh (13578) | more than 5 years ago | (#26774297)

Bits are bits. There's no guarantee that a "low power" chip will only mess up the low-order bits of a number.

What is this guy smokin'?

Re:I Call BS (1)

Oidhche (1244906) | more than 5 years ago | (#26774487)

More importantly, how do you ensure that only data is messed up? Flipping a bit of data might be acceptable. Flipping a bit of code probably isn't.

Re:I Call BS (0)

Anonymous Coward | more than 5 years ago | (#26774633)

There's no guarantee that a "low power" chip will only mess up the low-order bits of a number.

That was neither implied nor said anywhere. You're a moron.

Re:I Call BS (1)

Dogtanian (588974) | more than 5 years ago | (#26774669)

There's no guarantee that a "low power" chip will only mess up the low-order bits of a number.

Yeah- you spotted the obvious issue that those guys never *ever* would have taken into account before starting work on their design. I hereby nominate you for the Nobel prize.

The article really doesn't make clear how the chip works, what the probabilities are, etc. so if you're able to come to that conclusion you either know more about it than me, or you're just sitting typing pat answers into Slashdot.

Oh, and if you want to be pedantic there's no *guarantee* that *any* computer will work correctly- it could get bombarded by enough cosmic rays to generate uncorrectable memory errors.

Cool for my entertainment (1)

hackingbear (988354) | more than 5 years ago | (#26774301)

Wow... soon, I will be able to comprehend music and video at 7x the standard playing speed! Wonderful!

My bank yould love this (2, Interesting)

Celc (1471887) | more than 5 years ago | (#26774329)

I'm sure my bank would love to argue that they at least got the 13 right as they skim a penny off every transaction. I'm sure this is the most awesome thing since sliced bread, but can we please avoid trying to argue this from the point of people's bank accounts when it introduces random error.

What a waste of grant money... (2, Insightful)

gavron (1300111) | more than 5 years ago | (#26774335)

Processors that provide different output for the same input cannot be used for anything that wants predictable output.

They cannot be used for ANY result that is later used by anything else -- after all, computations based on bad data yield bad results.

Next thing you know, some offshore manufacturer will use the "imprecise" (cheaper) chips instead of the "accurate" ones, and simple things we depend on every day will fail in wonky ways.

A bit-flip on a microwave will make a 30-second timer not expire at 0, and keep going at 99:99 and burn the food.

A bit-flip on a home heating circuit will make 70F appear to be a target heat of 6F and never turn on.

A bit-flip on an MP3 player would have it skip from 65 seconds into the song to 134 seconds into the song.

These are all results if just ONE BIT were to flip JUST ONCE. The processor described would UNPREDICTABLY and RANDOMLY do worse than actually flip one bit.

What's next -- they'll put three processors in each device and when two of them agree they'll go for it? "Voting Processing" is bs.

E

No Ponzi scheme (0)

Anonymous Coward | more than 5 years ago | (#26774341)

That's actually what Ponzi and his descendants tried to do. It was just little imperfections with 0s and 1s: a little accuracy sacrificed for quantity of clients. You can't blame them; it's progress, damn it!

where art and science meet, perhaps? (1)

untorqued (957628) | more than 5 years ago | (#26774375)

This reminds me of the introduction to Samuel R. Delany's The Motion of Light in Water, where he talks about his admittedly faulty memory colliding with a biographer's researched facts. He concludes a long explanation with "...the wrong sentence still feels to me righter than the right one."

No, this technology isn't appropriate for financial transactions. But anywhere that randomness could open the door to unexpected results that shed new light on something, I think this could be pretty exciting.

Re:where art and science meet, perhaps? (1)

gavron (1300111) | more than 5 years ago | (#26774451)

Art has nothing to do with science.

Science is the application of the scientific method.

> I think this could be pretty exciting

Pick up a Magic 8-ball or a pair of dice or do numerology on the next /. post following this one. Those are all equally random, and by your definition, exciting.

Excitement should not come from "OH MY GOD WE'RE ALL GOING TO DIE" but rather from something positive. I don't want my planes, trains, automobiles, bank balances, MP3 players, GPS receiver, microwave, timed lights, alarm system, fuel-injection computer, television, DVR, or anything else providing me with extra excitement. I want them all to do EXACTLY what their specs say they will do.

E

19 times out of 20 (1)

RichMan (8097) | more than 5 years ago | (#26774383)

Don't forget the probability curve. You can design it so that most of the time the lost accuracy falls where you don't care about it (the cents). But if a million people are banking, probability says some of them will see significant errors (in the thousands column).

This is not good for
a) firing a missile
b) driving a car
c) designing a bridge or building
d) my bank balance

General computing.
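A toy model makes the parent's point concrete. Assume (purely for illustration -- TFA publishes no real error distribution for Palem's chip) that each output bit flips with a probability that shrinks sharply with significance:

```python
import random

def noisy_add(a, b, flip_prob=0.01, bits=32):
    """Add two integers, then flip each bit of the result with a
    probability that shrinks sharply with significance. This error
    model is invented for illustration -- TFA publishes no real
    distribution for Palem's chip."""
    result = a + b
    for i in range(bits):
        # bit 0 flips with flip_prob; each higher bit is 10x safer
        if random.random() < flip_prob / (10 ** i):
            result ^= 1 << i
    return result

# Sum a $13,000.81 balance (in cents) 10,000 times: nearly all errors
# stay down in the cents, but the tail reaches higher digits.
random.seed(1)
errors = [abs(noisy_add(1_300_081, 0) - 1_300_081) for _ in range(10_000)]
print(max(errors), sum(e > 0 for e in errors))
```

Even with low-order bits strongly favored, run it across enough customers and someone eventually eats a big flip -- which is the parent's objection in a nutshell.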

Re:19 times out of 20 (1)

gavron (1300111) | more than 5 years ago | (#26774495)

No bank will run a system where, 1 time in 20, it has a 50% chance of losing money.

Remember, it's not about the "probability curve" but rather that there are two sides to every equation. If you, THE USER, are willing to accept the chance of an error in your favor or against you, that doesn't mean THE OTHER SIDE will. Banks have to be accurate.

Let's use another example: that fabled MP3 player where SOMEHOW MAGICALLY the processor knows whether it's being asked to decode an MPEG-1 Layer 3 frame vs. being asked to calculate the next frame to retrieve. (There's no way for it to know.) So let's say you're willing to accept an erroneous decoding of that frame. OK. Perhaps the next frame will also be decoded erroneously. After all, it's like dice: EACH roll has that probabilistic potential of being wrong.

The artist who recorded the work, the producer who put it on the medium, and the distributor who sold it to you -- all of them DON'T WANT that probabilistic potential degrading their work.

So let's go from the other direction... WHICH APPLICATION wouldn't mind if it randomly did random things? You know, a simple IF-THEN will evaluate wrong, or a simple GOTO will go somewhere else (or not go at all), or a WHILE...DO will quit before being done...

I have no idea which application (either software or real life institution) would go for this.

E

The first thing that comes to mind... (3, Insightful)

Lordfly (590616) | more than 5 years ago | (#26774423)

...is gaming applications.

Programmers spend a lot of time coming up with algorithms that simulate randomness for AI or cloud generation or landscapes or whatever... if the processor were just wonky to begin with, it'd make certain things look a lot more natural.

It's interesting that AI in games is always touted as being "ultra-realistic" but always ends up being insanely easy to trip up. Having something "close enough" would add just enough realism/randomness to situations to perhaps make games and environments more dynamic.

I wouldn't want these things processing my bank balance, though, unless it rounded up.

A Little Bit of History Repeating (2, Interesting)

lobiusmoop (305328) | more than 5 years ago | (#26774427)

So basically he's advocating fuzzy logic [wikipedia.org], which was big in AI research in the '80s?

Well, it's a *little* more important (2, Interesting)

roystgnr (4015) | more than 5 years ago | (#26774481)

For example, in calculating a bank balance of $13,000.81, deliberately risking getting the "13" incorrect is fraud that risks $13,000 in damages and $1,000,000 in statutory penalties, and risking getting the "81" incorrect is fraud that only risks $0.81 in damages and $1,000,000 in statutory penalties. Surely saving a couple watt-microseconds is worth that!

wrong target (1)

800DeadCCs (996359) | more than 5 years ago | (#26774507)

Anyone who used this for money would (and should) be skinned alive.
It's the wrong thing to target.

If it's not exactly accurate but instead centers around averages (how well does it converge toward the REAL answer?), then I could see uses for it -- but still running in parallel with normal procs as a rough "checksum"...

CGI and various modeling tasks where you just need to pour power at the problem, for example.

Maybe make some kind of "accelerator" board with it to be used for certain apps. (Still pissed about not having a reasonably priced Cell board like that.)

GREAT! fresh bugs (0)

Anonymous Coward | more than 5 years ago | (#26774515)

All we need is a fresh batch of hardware bugs, just when we'd gotten around to killing some of the ones in our software.

I finally understand (1)

BubbaDoom (1353181) | more than 5 years ago | (#26774563)

The answer 42 now makes perfect sense.

hopeless article (1)

kwikrick (755625) | more than 5 years ago | (#26774571)

No information in TFA whatsoever, only a rather bad example.

Probabilistic algorithms are very useful in computer science, and introducing uncertainty at the hardware level might be very interesting for some applications. But how did they implement this? How do they save power? How are they faster than normal processors?

Bah. I hate non-articles like this.


So, to get it right... (1)

Presence1 (524732) | more than 5 years ago | (#26774595)

... they would need to use a crowd of these processors and some kind of "wisdom of crowds" algorithm to figure out which of the output values is good.

So, in rough figures: if 30 processors is enough to get a good, reliable answer from the 'crowd' of processors, and the overhead of the "wisdom of crowds" algorithm is less than 14%, then maybe we have a system that uses the same power and is about 6x as fast -- but no power savings.

If a less-good answer is acceptable, then maybe only a few processors are necessary, and there is a net power savings. If it takes more than a crowd of 30 to make a completely reliable answer, then it costs power, but we still get a faster system.

It could be useful for certain applications, if carefully applied. In addition to the media playback and gaming apps already mentioned, it could be good in robotics, where speed and low power count for a lot, and the feedback loops do a lot of successive approximation anyway.
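For what it's worth, the arithmetic checks out if you take the article's per-chip figures (7x speed, 1/30th power) at face value, plus the 14% voting overhead assumed above:

```python
# Taking the article's per-chip figures (7x speed, 1/30th power) and the
# parent's assumed 14% voting overhead at face value:
chips = 30
speed_per_chip = 7.0          # relative to a conventional processor
power_per_chip = 1.0 / 30.0   # relative to a conventional processor
voting_overhead = 0.14        # an assumption, not a figure from TFA

total_power = chips * power_per_chip                      # 1.0 -> same power
effective_speed = speed_per_chip * (1 - voting_overhead)  # ~6x
print(total_power, effective_speed)
```

The unstated assumption is that the workload parallelizes perfectly across the 30 replicas, which real voting schemes rarely achieve.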

Porn? (1)

gravis777 (123605) | more than 5 years ago | (#26774609)

Yeah, cause no one will care if you try to log into godtube and instead get horse-on-girl porn

Or, because the suffix is unimportant, I get whitehouse.com instead of whitehouse.gov

Cause you all know I must have a processor 7 times faster than what is in an iPhone or BlackBerry to watch video streaming over 3G on a 2-inch screen

The lower power consumption might be a plus, though

Welcome to Bizarro World (0)

Anonymous Coward | more than 5 years ago | (#26774611)

I don't know if the Professor or the reporter wrote the example, but they must inhabit some alternate bizarro world.

I know that if my bank account was supposed to be $13,000.81 but my bank came up with $13,000.57 instead, I'd be massively upset.

Moral: Always pick a reasonable analogy to describe your new idea, otherwise you'll get thrown in with all the other crackpots with idiot ideas.

for simulation (2, Insightful)

kilraid (645166) | more than 5 years ago | (#26774615)

For simulation, Monte Carlo, and similar statistical sampling, this would be perfect. There is already random error -- sampling error -- so trading a smaller source of error (computational error) for a reduction in the first (faster chips mean more samples, hence less sampling error) would make sense if the computations can be sped up.
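A quick sketch of why: perturbing every sample of a Monte Carlo pi estimate with a small computational error barely moves the answer, because sampling error dominates. (The noise model here is invented -- TFA says nothing about the chip's actual error distribution.)

```python
import random

def estimate_pi(n, noise=0.0):
    """Monte Carlo estimate of pi from n random points in the unit
    square. `noise` perturbs each computed radius, standing in for an
    imprecise processor (an invented model -- TFA describes no actual
    error distribution)."""
    inside = 0
    for _ in range(n):
        x, y = random.random(), random.random()
        r2 = x * x + y * y + random.uniform(-noise, noise)
        if r2 <= 1.0:
            inside += 1
    return 4.0 * inside / n

# Same random points both times: a per-sample computational error of
# 1e-3 only matters for points sitting right on the circle boundary.
random.seed(0)
exact = estimate_pi(100_000)
random.seed(0)
noisy = estimate_pi(100_000, noise=1e-3)
print(exact, noisy)
```

At n = 100,000 the sampling error is around 0.005, so a per-sample perturbation of 1e-3 is lost in the statistical noise -- exactly the regime where a faster, sloppier chip wins.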

It makes sense to me (0)

Anonymous Coward | more than 5 years ago | (#26774625)

All processors have errors and systematics. Just perform a complicated calculation on two computers and the results won't be identical, though statistically very similar.

However, there is no word in the article about how faulty the "new" processor is. He says it would consume 1/30th the power and be 7 times faster. Let's go for probabilistic computing! Put in 30 processors and gain 210 times in speed, run the calculation on each processor, and do statistics.

It doesn't seem to me a bad approach.
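The "do statistics" step could be as simple as a per-bit majority vote across the replicas. A toy sketch with an assumed 5%-per-bit flip rate (TFA gives no real numbers):

```python
import random

def unreliable_compute(true_value, flip_prob=0.05, bits=16):
    """One run on a hypothetical error-prone chip: each output bit flips
    independently with probability flip_prob (an invented error model --
    TFA gives no real rates)."""
    out = true_value
    for i in range(bits):
        if random.random() < flip_prob:
            out ^= 1 << i
    return out

random.seed(42)
true_value = 12345
runs = [unreliable_compute(true_value) for _ in range(30)]

# Per-bit majority vote across the 30 replicas: even though many
# individual runs are wrong, each bit is right in ~95% of runs, so the
# vote recovers the exact answer with overwhelming probability.
voted = 0
for i in range(16):
    if sum((r >> i) & 1 for r in runs) > 15:
        voted |= 1 << i
print(voted, true_value)
```

Of course, this assumes errors are independent across replicas; correlated noise (the kind TFA says plagues dense chips) would defeat a naive vote.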

Obligatory NASA research (5, Funny)

DirePickle (796986) | more than 5 years ago | (#26774639)

From NASA [b2b2c.ca] :

Computer scientist Arthur Boran was ecstatic. A few minutes earlier, he had programmed a basic mathematical problem into his prototypical Akron I computer. His request was simply, "Give me the sum of every odd number between zero and ten." The computer's quick answer, 157, was unexpected, to say the least. With growing excitement, Boran requested an explanation of the computer's reasoning. The printout read as follows: THE TERM "ODD NUMBER" IS AMBIGUOUS. I THEREFORE CHOOSE TO INTERPRET IT AS MEANING "A NUMBER THAT IS FUNNY LOOKING." USING MY AESTHETIC JUDGEMENT, I PICKED THE NUMBERS 3, 8, AND 147, ADDED THEM UP, AND GOT 157.

A few moments later there was an addendum: I GUESS I MEANT 158.

Followed shortly thereafter by: 147 IS MORE THAN 10, ISN'T IT? SORRY.

Article devoid of content (0)

Anonymous Coward | more than 5 years ago | (#26774679)

This article doesn't in any way explain this concept. Seems like it may potentially have merit, but no detail is given as to the implementation. The IEEE floating point representations already trade off accuracy for speed/efficiency. So, surely that isn't what is being claimed.

The article implies that noise in some bits is more significant than in others and that this leads to increased energy expenditure, but doesn't explain how this is leveraged.

Does the proposed processor perhaps ensure high precision for high-order bits, and less precision for low-order bits in multiplications and additions, say? That leaves a lot of open questions for other kinds of instructions.

Obviously, that approach wouldn't work for the vast majority of existing control systems in software, which require binary answers, but it might be compatible with a looser logic system like fuzzy logic. That requires supporting infrastructure that isn't mentioned, however.

It's possible the errors are rare or correctable, as has been developed in quantum computing, but this point isn't developed either.

It's also possible, as one earlier comment suggested, that this is some form of variable precision, but that would need some explaining as well.
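For the variable-precision reading above, one guess at how "less precision for low-order bits" could work is to simply skip the low-order partial products in a multiply. A sketch (not claimed to be Palem's actual scheme):

```python
def approx_mul(a, b, drop_bits=8):
    """Multiply two non-negative ints while skipping the low-order work:
    truncate both operands, multiply, shift back. The high-order bits
    stay accurate; the bottom ~2*drop_bits bits are simply wrong. This
    is one guess at "less precision for low-order bits" -- not claimed
    to be Palem's actual scheme."""
    return ((a >> drop_bits) * (b >> drop_bits)) << (2 * drop_bits)

a, b = 1_000_003, 2_000_001
exact = a * b
approx = approx_mul(a, b)
rel_err = abs(exact - approx) / exact
print(rel_err)  # small -- the error lives in the discarded low bits
```

Hardware truncated multipliers in DSPs work on roughly this principle, so the idea isn't crazy; the open question is how the chip handles instructions where no bit is safely "low-order", like branches.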

Have Both With Ninnle! (0)

Anonymous Coward | more than 5 years ago | (#26774685)

The legions of Ninnle Linux users can tell you that, with Ninnle Linux, you can have your cake and eat it too. Speed, accuracy, flexibility and security all in one package. Join the Ninnle Revolution today!