
Moore's Law Is Becoming Irrelevant, Says ARM's Boss

Soulskill posted about a year and a half ago | from the not-useful-anymoore dept.


holy_calamity writes "PCs will inevitably shift over to ARM-based chips because efficiency now matters more than gains in raw performance, the CEO of chip designer ARM tells MIT Technology Review. He also says the growing number of ARM-based suppliers is good for innovation (and for prices) because it spurs a competitive environment. 'There’s been a lot more innovation in the world of mobile phones over the last 15-20 years than there has been in the world of PCs.'"


236 comments

Sure... (0)

Anonymous Coward | about a year and a half ago | (#41935657)

...and by this, Intel's fab advantage will eventually make ARM irrelevant.

Re:Sure... (2)

ackthpt (218170) | about a year and a half ago | (#41935701)

...and by this, Intel's fab advantage will eventually make ARM irrelevant.

Simply optimizing code could do that

Re:Sure... (1)

NatasRevol (731260) | about a year and a half ago | (#41936321)

'Simply', huh?

Duh (5, Insightful)

Anonymous Coward | about a year and a half ago | (#41935659)

CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.

Re:Duh (1)

Anonymous Coward | about a year and a half ago | (#41935817)

They don't make CPUs; they sell licenses. Let's see in a few months whether ARM CPUs are really more power-efficient than Intel products.

Re:Duh (2, Insightful)

NatasRevol (731260) | about a year and a half ago | (#41936329)

Pretty sure they will be, since they are now and have been since... forever?

Re:Duh (2, Informative)

Anonymous Coward | about a year and a half ago | (#41936749)

Except Intel CPUs have been becoming far more power efficient over the last few years too. I recently replaced my old Pentium-4 Windows PC with a new i7-based PC. The P4 has one core, runs two threads, is rated at around 130W and, when playing games, the system sounds like a jet engine. The i7 has four cores, runs eight threads and is rated at 77W. When playing games I can hardly hear it under my desk and the air coming out the back is barely warm.

Re:Duh (0)

Lucky75 (1265142) | about a year and a half ago | (#41936817)

Of course, you're not playing games on the i7 without a graphics card, because their integrated solution sucks.

Re:Duh (3, Interesting)

K. S. Kyosuke (729550) | about a year and a half ago | (#41936779)

Since the 80386. :-) I wonder what would have happened had the Archimedes actually succeeded in the market. In those days, it was faster, cheaper, and more energy-efficient than 80386 PCs. Perhaps a bigger performance differential than just 3x (?) would have helped. Or perhaps not.

Re:Duh (1)

Luckyo (1726890) | about a year and a half ago | (#41936477)

"That design..."

I think we both understood the actual meaning of GP.

Re:Duh (1)

K. S. Kyosuke (729550) | about a year and a half ago | (#41936741)

CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.

OR, it's the other way round: the people currently at the helm thought around 1990 that the future would be in efficient CPUs and so they formed an efficient CPU company. Then, they just kept their point of view.

That must explain their popularity. (0)

Anonymous Coward | about a year and a half ago | (#41936793)

I guess that's why all the low-power Pentiums with two cores and no hyperthreading have about ten or twenty reviews on Newegg, and all the Core i7s that score 5x higher on Passmark and use 3x as much power, while costing several times more, have hundreds of reviews.

More efficient processor (Pentium G630), 18 reviews: http://www.newegg.com/Product/Product.aspx?Item=N82E16819116406
Less efficient processor (Core i7-3770K), 357 reviews: http://www.newegg.com/Product/Product.aspx?Item=N82E16819116501

On Amazon it's the same pattern, with 47 reviews for the bitchin' fast processor and 7 for the futuristic low-power one.

I'm also noticing that the difference between the best GPU/CPU and the second-best model is a margin of 30-40% on a good day, just like it has been for the last decade.

LOL (0, Insightful)

Anonymous Coward | about a year and a half ago | (#41935671)

Guy who works for company thinks said company's product will dominate product of other companies: Film at 11.

Well, at least he didn't compare things to cars. (0)

Anonymous Coward | about a year and a half ago | (#41935677)

Then GM and Ford would come up with some witty comeback about crashing several times a day.

Bill Gates's friend Jerry may have something to say.

Not built for speed?!? (4, Interesting)

ackthpt (218170) | about a year and a half ago | (#41935689)

But every new version of an operating system has more bloat than ever. There must be some corollary to Moore's Law which states that successive operating systems will still require higher performance, but users will simply become accustomed to slower response times.

We could call it the Blort Law.

Re:Not built for speed?!? (4, Informative)

GrumpySteen (1250194) | about a year and a half ago | (#41935759)

Wirth's Law [techopedia.com]:
Software is getting slower more rapidly than hardware is getting faster.

Re:Not built for speed?!? (-1, Flamebait)

poetmatt (793785) | about a year and a half ago | (#41935957)

Probably 50-75% of the reason for this is the focus on DRM and security. Other OSes outside of Windows do not experience this bloat shit; it gets marked as a regression and fixed.

Re:Not built for speed?!? (4, Insightful)

atlasdropperofworlds (888683) | about a year and a half ago | (#41936053)

Let's look at a couple of real examples: Win7 had fewer features and a smaller memory footprint than Vista, ran faster, and offered stronger security. Win8 is faster than Win7 (or at least the same speed), has a smaller memory footprint, and has further upgraded security features. I'm not feeling where you're coming from.

Re:Not built for speed?!? (2)

K. S. Kyosuke (729550) | about a year and a half ago | (#41936789)

How much flash memory does a Win 8 installation take on a tablet?

Re:Not built for speed?!? (1)

Anonymous Coward | about a year and a half ago | (#41936849)

Oh, so you read the story about how Windows 8 uses lots of flash memory on a tablet the other day, and now you're parroting that information to show how clever you are.

Re:Not built for speed?!? (1)

drakaan (688386) | about a year and a half ago | (#41936107)

DRM, yes...security, no. Securing your code (making it not fail under the weight of random exploits) doesn't slow things down. Adding in additional complexity, holes, and latency to your software stack with DRM definitely slows things down.

Re:Not built for speed?!? (2, Insightful)

geekoid (135745) | about a year and a half ago | (#41936211)

" Securing your code (making it not fail under the weight of random exploits) doesn't slow things down."
Of course it does. Checks take resources.

"Adding in additional complexity, holes, and latency to your software stack with DRM definitely slows things down."
also true

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41936493)

OK. Can you explain how DRM "slows things down" when you abstain from playing DRM media?

Gates' Law (5, Funny)

Citizen of Earth (569446) | about a year and a half ago | (#41935777)

It's called Gates' Law: every 18 months, the speed of software halves.

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41935793)

But every newer version of operating systems has more bloat than ever.

Nope. Well, this may be true for Windows, but Windows is not the only OS around. Linux still works with small processors and little memory. At least the OS does. I run bigger apps on Linux these days, but only because even small machines have lots of power nowadays. The OS itself does not need much.

Re:Not built for speed?!? (1)

Anonymous Coward | about a year and a half ago | (#41935913)

I would like to inform you that Linux is NOT an operating system!!! What you are probably referring to is GNU/Linux (or GNU + Linux)

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41936859)

Neither of those things is an operating system.

Sincerely,
Beastie

Re:Not built for speed?!? (0)

Billly Gates (198444) | about a year and a half ago | (#41935973)

Linux is the other way around. Years ago Linux had battery and power-saving metrics that put Windows to shame.

Today Windows 8 runs on a 9-year-old Pentium 4 with ease. Windows Vista can't even run on these. Windows 7 could, but not as well as XP unless it was loaded with RAM. Windows 8 can.

Mac OS X 10.0-10.3 was slow as hell. 10.6 was fast enough to run on the colored iMacs that 10.1 could not, hence those users stayed on Mac OS Classic.

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41936345)

Today Windows 8 runs on a 9-year-old Pentium 4 with ease. Windows Vista can't even run on these. Windows 7 could, but not as well as XP unless it was loaded with RAM. Windows 8 can.

Why do I have a hard time believing that, when Win8 has a much smaller pool of available drivers than Vista? My understanding is that Win8 specifically focuses on current- and newer-generation hardware. And the further one goes back in time, the less likely it is to be supported, unless it's all name-brand and extremely well-established brands/chipsets.

Something tells me that's wishful thinking on your part.

Re:Not built for speed?!? (2)

Cinder6 (894572) | about a year and a half ago | (#41935993)

Is it even true with Windows (anymore, at least)? Windows 7 is faster than Vista, and 8 is equal to or faster than Windows 7 in most cases. See: http://www.techspot.com/review/561-windows8-vs-windows7/page2.html [techspot.com]

According to Engadget, PCMark 7 has a bug that causes Windows 8 to score lower than it should.

Re:Not built for speed?!? (1)

JDG1980 (2438906) | about a year and a half ago | (#41936161)

Nope. Well, this may be true for windows, but windows is not the only OS around.

Even on Windows, this is not true. Windows 7 is faster and less bloated than Vista, and on par with XP. Supposedly Windows 8 is smaller still, though even on a diet, Windows can't compete with purpose-built portable OSes like Android and iOS when it comes to efficiency.

Re:Not built for speed?!? (1)

Billly Gates (198444) | about a year and a half ago | (#41935923)

Windows and Mac OS X have become less bloated than their previous iterations.

Windows 7 is much faster and leaner than Windows Vista, and Windows 8 is so lean it puts Windows 7 to shame on ancient Pentium 4s! Linux has sadly regressed, and Windows went through stages of regression but has improved starting with Windows 7.

Slower response times are excuses made by those who do not adapt well to change. If I were a betting man, my hunch is that you're an XP user. ;-)

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41936041)

Windows 7 is much faster and leaner than Windows Vista

Any scientific data to support that assertion? (For me Vista was and is fast enough. I see no difference between 7 and Vista installation)

Re:Not built for speed?!? (1)

Billly Gates (198444) | about a year and a half ago | (#41936143)

Windows 7 is much faster and leaner than Windows Vista

Any scientific data to support that assertion? (For me Vista was and is fast enough. I see no difference between 7 and Vista installation)

How about here [maximumpc.com] ?

That was 3 years ago, and hardware is much newer now, so XP would likely be very laggy. Compared to Vista, Windows 7 trounces it HUGELY: 50% better network performance and 25% faster HD access. Vista really is a terrible OS, whether the service packs fixed many issues or not.

Re:Not built for speed?!? (0)

Luckyo (1726890) | about a year and a half ago | (#41936569)

No idea about hypotheticals and synthetic benchmarks, but in practice both this gaming machine I'm typing on (2500K, 4 gigs of RAM and a 560 Ti) and my previous computer (Core 2 Duo, 4 gigs of RAM and a Radeon 4870) run significantly faster when booted to XP than to 7.

Unfortunately XP has no DX11 support, so 7 is the main OS on this one. It runs noticeably slower under 7 than when I boot XP on it, and it even runs slower under 7 than my old machine running XP.

Anecdotal, but a real comparison across two different computers. I'll trust my own senses over hypotheticals and benchmarks.

Re:Not built for speed?!? (0)

Anonymous Coward | about a year and a half ago | (#41936895)

Paul Blort Mall Lawyer approves this message.

Title is rubbish (4, Informative)

YodasEvilTwin (2014446) | about a year and a half ago | (#41935691)

The guy says nothing of the sort, it's just the title of the article. All he says is that efficiency is becoming more and more important, and that ARM offers such efficiency. (He *also* says that ARM can offer performance as well.)

Re:Title is rubbish (0)

Anonymous Coward | about a year and a half ago | (#41935721)

This is \. You're not supposed to RTFA!

Re:Title is rubbish (1)

Annirak (181684) | about a year and a half ago | (#41935771)

Back-slash-dot? Is that where you don't bother reading TFA? I read before I comment, anyway.

Re:Title is rubbish (0)

Anonymous Coward | about a year and a half ago | (#41935855)

It's just his cowardly subconscious telling him to escape.

Re:Title is rubbish (0)

Anonymous Coward | about a year and a half ago | (#41936385)

Or a dorky Windows user.

Re:Title is rubbish (1)

camperdave (969942) | about a year and a half ago | (#41936633)

\. is a "tech" site for Microsoft fans. "News for Windows Nerds. MS_Stuff that matters."

Re:Title is rubbish (1)

jedidiah (1196) | about a year and a half ago | (#41935801)

> and that ARM offers such efficiency

Big deal.

Intel has been getting its act together in this regard. So this advantage of ARM isn't so great anymore. Meanwhile, you still do have the massive performance gap between x86 and ARM should you decide to do something besides browse LOLcats.

If anything, it's AMD that's lagging behind here.

Re:Title is rubbish (4, Insightful)

Hatta (162192) | about a year and a half ago | (#41936037)

Smaller transistors can be operated with less current, so Moore's law remains as relevant as ever.

Re:Title is rubbish (0)

Anonymous Coward | about a year and a half ago | (#41936303)

Smaller transistors also have lower switching losses but are leaky as fuck, so you really want to race-to-idle and gate off power to as much of the chip area as possible.
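
For context, here is the standard first-order CMOS power model behind that point (textbook material, not from the thread itself):

<ecode>
P_{total} = \underbrace{\alpha C V_{dd}^{2} f}_{\text{dynamic (switching)}} + \underbrace{V_{dd} I_{leak}}_{\text{static (leakage)}}
</ecode>

Process shrinks cut the switching term (smaller C, lower V_dd), while the leakage term grows as gates get thinner; that is exactly why race-to-idle and aggressive power gating pay off on modern nodes.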

Re:Title is rubbish (3, Informative)

geekoid (135745) | about a year and a half ago | (#41936309)

What?

Twice the transistors, half the price. That is what Moore's law boils down to, according to his paper. Read it.

And yes, it's not relevant for a number of reasons.
As a real-world example:
In '06 you could get a 3 GHz computer. If Moore's law still applied to speed, we would be able to get a 24 GHz chip right now.
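
Just to make the arithmetic in that example explicit (my own sketch; the 24 GHz figure corresponds to a 2-year doubling period, while the 18-month reading would give 48 GHz):

<ecode>
# Hypothetical projection: what clock speed would you get if Moore's law
# still applied to frequency? (It actually describes transistor count.)
base_year, base_ghz = 2006, 3.0

def projected_ghz(year, doubling_years):
    """Naive exponential projection from the 2006 baseline."""
    doublings = (year - base_year) / doubling_years
    return base_ghz * 2 ** doublings

print(projected_ghz(2012, 2.0))  # 24.0 GHz, the parent's figure
print(projected_ghz(2012, 1.5))  # 48.0 GHz under the 18-month reading
</ecode>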

disturbing (0)

Anonymous Coward | about a year and a half ago | (#41935719)

If this guy gets his way, then we may never have sentient computers.

At least keep Moore's Law alive through 2025.

-John Henry

Re:disturbing (1)

Andy Prough (2730467) | about a year and a half ago | (#41935959)

And we might not ever get to enter Tron.

Moore's "Law" (1)

globaljustin (574257) | about a year and a half ago | (#41936695)

I'm *very* glad you made your comment...it highlights a serious mistake in computing...a mistake that costs **BILLIONS** and affects people directly...

"If this guy gets his way, then we may never have sentient computers."

The idea that Moore's "Law" is somehow a scientific predictor that indicates the ability and future inevitability of humans making 'sentient' computers is...well...it's ridiculous.

1. Computers execute instructions...humans are beyond that, we have *free will*...For humans to make 'sentient' computers the system would by definition cease to be a computer. Something that has *free will* is completely different from something that just **follows instructions**

We can never 'make' a sentient computer...and even if we **could**....

2. Moore's "Law" wouldn't predict when it would happen at all, because it is anecdotal, not quantitative in nature. It shows an **interesting trend** that could **become** a predictive 'law' if someone made a theory of it and tested it.

So there you have it. IMHO, geeks create fictions by transmogrifying anecdotal data into quantitative data...why? Simple: quantitative data has less uncertainty, and is therefore easier for a 'geek-minded' person to synthesize.

however, anecdotal evidence must be seen and analyzed for what it is...continuous, non-quantitative, noisy, and information-rich

Efficiency! (5, Interesting)

CajunArson (465943) | about a year and a half ago | (#41935735)

" efficiency now matters more than gains in raw performance"

Sure, so why don't you start off by telling us why an Exynos Cortex A-15 chip running a web benchmark is using about 8 watts of power, with the display turned off so only SoC power is being measured, while Intel has already demoed a full-blown Haswell running Unigine Heaven at... 8 watts.

So when the miraculous Cortex A-15 uses the same amount of power as the supposedly "bloated" x86 Haswell, while Haswell is running a benchmark that is massively more intensive than a web-browser test, who is really making the most "efficient" platform?

Exynos Source: http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/7 [anandtech.com]
Haswell Demo Video: http://www.youtube.com/watch?v=cKvVdhkgAxg [youtube.com]

Re:Efficiency! (3, Informative)

dgatwood (11270) | about a year and a half ago | (#41935947)

That's a false comparison, though. If users mostly ran benchmarks 24x7, that would be a good test of efficiency. The reality, however, is that CPUs mostly sit idle, so to compute average efficiency, you have to factor that in.

Granted, a faster CPU that can reach an idle state sooner can be more efficient than a slower CPU that runs at full bore for a longer period of time, but only if the idle wattage is fairly similar.
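
A toy model of that trade-off, with invented numbers rather than data about any real chip, shows the race-to-idle arithmetic:

<ecode>
def average_power(active_w, idle_w, work_units, speed, window_s):
    """Average power over a fixed window: run at active_w until the job
    is done, then sit at idle_w for the rest of the window."""
    busy_s = min(work_units / speed, window_s)
    return (active_w * busy_s + idle_w * (window_s - busy_s)) / window_s

# Fast chip: 8 W active, 0.5 W idle, finishes the job in 2 s.
print(average_power(8.0, 0.5, work_units=100, speed=50, window_s=10))  # 2.0 W
# Slow chip: 2 W active, 0.3 W idle, busy for the whole 10 s window.
print(average_power(2.0, 0.3, work_units=100, speed=10, window_s=10))  # 2.0 W
# The same fast chip with a poor 1.5 W idle state loses its advantage.
print(average_power(8.0, 1.5, work_units=100, speed=50, window_s=10))  # 2.8 W
</ecode>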

Re:Efficiency! (2)

atlasdropperofworlds (888683) | about a year and a half ago | (#41936115)

I agree, and I'd put my money on Intel reducing idle wattage faster than ARM increasing performance.

Re:Efficiency! (2)

CajunArson (465943) | about a year and a half ago | (#41936195)

Good thing then that Haswell's idle power draw is 20x better than Ivy Bridge's, meaning that it is probably about the same as the Cortex A-15 (or maybe even better).

I'm not saying that Haswell belongs in a smartphone... and unless you downclock that Exynos, you don't want it in a smartphone either. I *am* saying that the blind assumption that ARM == efficiency tends to disintegrate when confronted with facts. I'm also saying that if Haswell can run at 8 watts, the whole "x86 wastes powar!" line is going to sound pretty silly when next-generation 22nm Atoms show up.

Re:Efficiency! (1)

Pulzar (81031) | about a year and a half ago | (#41936015)

Sure, so why don't you start off by telling us why an Exynos Cortex A-15 chip running a web benchmark is using about 8 watts of power, with the display turned off so only SoC power is being measured, while Intel has already demoed a full-blown Haswell running Unigine Heaven at... 8 watts

Wait, wait... are you trying to say that in a notebook system doing wireless web surfing, the only sources of power are the CPU and the display?

If so, you are way off.

Re:Efficiency! (2)

CajunArson (465943) | about a year and a half ago | (#41936109)

No, I'm saying that on a chromebook with a SoC (that stands for "system on a chip" you know...) the total power consumption of the SoC running a web benchmark that likely requires little or no wireless network power due to caching is equivalent to the power consumption of a low-power Haswell part (that is similar to a SoC but with a separate south-bridge MCM).

Oh, and if the Kraken benchmark is anything remotely similar to any other web browser benchmark I've ever seen, the CPU/GPU on the SoC are not being taxed at 100% for the whole time the benchmark is running, so an *average* power consumption of 8 watts is even *worse* than it looks compared to the Heaven benchmark which is one of the most CPU & GPU intensive benchmarks available anywhere.

Thanks for making me think about the comparison a bit more, it looks like I was being a little too nice to ARM when I said the power consumption was "equal" when in fact it is likely skewed in ARM's favor due to the lighter workload on the ARM chip that would consume even more power if truly stressed.

Re:Efficiency! (-1, Troll)

Baloroth (2370816) | about a year and a half ago | (#41936157)

Holy shit, what is that, 2 FPS in that Haswell demo? That's a slide-show, not a demo. Sure, it can run Unigine at 8 watts, but if you need 80 watts to actually make that smooth, I'm not impressed. Merely being able to run a demo like that isn't impressive. Plus, you know, CPU performance != graphics performance, so you are comparing the proverbial apples to oranges anyways.

Re:Efficiency! (1)

CajunArson (465943) | about a year and a half ago | (#41936261)

Really.. please show me the same demo running on... let's say... the A6 on the newest iPad. I'm sure it will get *much* better framerates.... (or not).

You completely missed the point of my post, but I can see that Intel is gradually working its way up Gandhi's list. It's getting to be between the "then they laugh at you" and the "then they fight you" stages....

Re:Efficiency! (0)

Anonymous Coward | about a year and a half ago | (#41936353)

He was clearly talking about the Ivy Bridge demo, which was jerky and 17 watts, and not the smooth Haswell at 7.8 watts. The guy didn't watch the entire video before running his mouth off.

Re:Efficiency! (1)

QuantumRiff (120817) | about a year and a half ago | (#41936657)

Wonder what the cost difference is between those two... If you're putting it in a low-cost device, a difference in price can be rather significant... (i.e., do I spend $30 more on each CPU, or go with the cheaper CPU and $20 worth of extra battery?)

Oh, wait... one is out in production, the other has no firm release date... So a brand-new chip that's not yet actually in use is faster and uses less power than one that has been around a while... Fascinating...

Hate the "Post-PC" era (5, Insightful)

Anonymous Coward | about a year and a half ago | (#41935739)

As a geek I love a powerful general-purpose machine that can do all the things an ebook reader/music player/web browser can do AND a whole lot more, like play 3D games, run a math or science simulation, record and edit video, and handle memory- and processor-intensive image editing. To me a tablet is little more than a crippled PC with the keyboard removed (fantastic, why did I learn to type at 90wpm again??) and a smudge-screen interface (hate viewing photos through finger marks!!!). It's really awesome that we have dumbed down our computers to the point of mediocrity. Even finding a decent e-book reading or music playing app - the things these pieces of shit are touted as making easier - is a nightmare. So many book readers don't even let you zoom on images. And browsing the web without Flash support is like trying to surf with one leg. I don't mind that there are dumbed-down idiot boxes for those who like to post pictures of food on Facebook, but I really resent the impact on general-purpose computing.

Re:Hate the "Post-PC" era (2)

shadowrat (1069614) | about a year and a half ago | (#41935775)

buy a raspberry pi, if you really are a geek.

Re:Hate the "Post-PC" era (0)

Anonymous Coward | about a year and a half ago | (#41935997)

Or an IBM POWER7. Those boxes leave everything else in the dust. Damned expensive, though, and it's no fun buying a cheaper, used POWER box, because by then it's no faster than an x86.

Re:Hate the "Post-PC" era (1)

afidel (530433) | about a year and a half ago | (#41935891)

Really, you hate the fact that the mobile Core i5 is more powerful than the previous generation while allowing all-day battery life? Because that's the biggest way that tablets have affected general-purpose computing that I can see. Sure, the current mobile i5 isn't going to transcode video as fast as a current desktop i7, but it'll do it considerably faster than a Core2-era desktop. Plus, optimizing idle power is good for the environment; replacing P4-era desktops with current-era machines will save you tons of money and drastically reduce emissions. It's not like you can't buy a $1,000 screaming-fast desktop today; most people just don't need to.

As to the media playing thing, I haven't had any problems on my Android tablets using strictly free apps from the marketplace.

Re:Hate the "Post-PC" era (1)

dmacleod808 (729707) | about a year and a half ago | (#41935907)

I pretty much prefer to browse the web without Flash support. Steve Jobs was right. Also, I type too fast for Slashdot.

Re:Hate the "Post-PC" era (0)

Anonymous Coward | about a year and a half ago | (#41935933)

If you like having to sit in one place while a large computer (read: anything bigger than what would fit easily in a pocket) serves up web pages, music, ebooks, etc then there will always be products for you! Not to worry. For those of us that appreciate being able to do low-involvement tasks while not sitting at our computer, tablets and handhelds are a godsend. My couch is way more comfortable than my desk chair.

Re:Hate the "Post-PC" era (0)

Anonymous Coward | about a year and a half ago | (#41935977)

Even the embedded-style devices are pretty fast. Just get used to Linux running a more traditional desktop UI. There will always be geeks who want to be more "power user" oriented, and open source means that so long as there is demand, people will keep those things going. GNOME 3, Unity, and now Windows 8 have all gone crazy with their UIs, but I'm running XFCE and am perfectly happy with it.

could not agree more (1)

Andy Prough (2730467) | about a year and a half ago | (#41936097)

Please Intel, keep making those big, inefficient chips.

Re:could not agree more (1)

HarrySquatter (1698416) | about a year and a half ago | (#41936411)

Yeah, if you ignore Ivy Bridge and Haswell.

Makes no sense! (3, Insightful)

Anonymous Coward | about a year and a half ago | (#41935773)

Moore's law just predicts transistor density - it says absolutely nothing about computational power. Increases in transistor density can make electronics more power-efficient, but that is still aligned with Moore's law.

The title is stupid, and the actual article says almost nothing like it.
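
For reference, the law as usually stated is an exponential in transistor count (my formulation, not the commenter's), with T the doubling period, quoted as anywhere from 18 months to two years; nothing in it mentions clock speed or computational power:

<ecode>
N(t) = N_0 \cdot 2^{(t - t_0)/T}
</ecode>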

Re:Makes no sense! (2)

ebunga (95613) | about a year and a half ago | (#41935949)

Actually, this means that the CEO of ARM doesn't know what a transistor is and why you would want more transistors in a tiny space.

Power (4, Insightful)

rossdee (243626) | about a year and a half ago | (#41935783)

Sure, efficiency matters, but only in portable devices. Desktops or other computers connected to the mains don't have a problem.

Hey, it's winter already; a watt used by your CPU is a watt less that has to be used by your radiant or convective heater.

Re:Power (2)

pclminion (145572) | about a year and a half ago | (#41935871)

This is only the case if your heat is electric. Otherwise you're comparing apples and oranges.

Re:Power (1)

Cinder6 (894572) | about a year and a half ago | (#41936599)

Now to expose my woeful lack of understanding of the topic!

Is it even apples to apples with electric heaters? I'm not sure how much power my PC is currently drawing, but its exhaust isn't particularly warm--in fact, it feels perceptibly cooler than the ambient temperature. I have no doubt there's a sort of "wind chill" factor going on (it's not a magic PC, so far as I know), but it seems like a damned inefficient heating appliance all the same, especially if I consider space heaters I've used in the past.

Re:Power (1)

pclminion (145572) | about a year and a half ago | (#41936647)

The vast majority of power consumed by the computer ends up as heat. Computers make heat, light (EM), and sound. Sound is mostly absorbed by the walls of your house. The amount of EM which leaks out of the case and after that, past the walls of your house, is pretty negligible. Note that "negligible" doesn't mean "not detectable," you can easily detect it, it just doesn't amount to much.

Cinder6 (1)

Cinder6 (894572) | about a year and a half ago | (#41936823)

So is the PC just an inefficient heater, then? Even my aluminum case is cold to the touch. If I didn't have so many fans (10 in total), would it make the room hotter?

I'm asking because I often see it claimed that PCs make great space heaters, but in my experience, this one plain doesn't. Under full load, it should draw quite a bit of power, but it outputs much, much less heat than lower-energy dedicated space heaters. I'm tempted to find my Kill-A-Watt and see what it says.

Re:Power (1)

rossdee (243626) | about a year and a half ago | (#41936811)

The heat that I can control is electric. The furnace is controlled by a thermostat that is upstairs, and we live on the north side of the building.

Re:Power (2)

dmacleod808 (729707) | about a year and a half ago | (#41935951)

My basement is not heated so well, even though the furnace is down there. My computer keeps me warm.

Re:Power (0)

Anonymous Coward | about a year and a half ago | (#41935967)

Electricity is a very bad choice for heating your house, efficiency- and ecology-wise. It's a very pure form of energy that is best suited to other uses.

Re:Power (4, Insightful)

Chewbacon (797801) | about a year and a half ago | (#41936099)

Efficiency matters to people who have many desktops around the home or office, and datacenters are focusing on efficient servers. So yeah, it does matter. Just because you're plugged into the wall doesn't mean that energy is infinite.

Re:Power (4, Insightful)

xlsior (524145) | about a year and a half ago | (#41936317)

Hey its winter already, a watt used by your CPU is a watt less that has to be used by your radiant or convective heater.

Except in the summer every watt used by your CPU requires your air conditioner to use more energy to counteract it.
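
Back-of-envelope numbers for both halves of this exchange (illustrative only; assumes purely resistive electric heat in winter and an air conditioner with a coefficient of performance around 3):

<ecode>
cpu_heat_w = 100.0  # waste heat from the PC, in watts
ac_cop = 3.0        # watts of heat moved per watt of electricity

# Winter, resistive electric heat: the PC's waste heat displaces heater
# output one-for-one, so the marginal heating cost of those watts is zero.
winter_extra_w = cpu_heat_w - cpu_heat_w  # 0.0 W
# Summer: the AC must pump that heat back outside, drawing heat/COP extra.
summer_extra_w = cpu_heat_w / ac_cop      # ~33.3 W
print(winter_extra_w, summer_extra_w)
</ecode>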

Re:Power (1)

geekoid (135745) | about a year and a half ago | (#41936459)

Wrong.

You assume one watt of electricity converted to heat by a CPU is the same as one watt converted by a heater. These are different devices with different inefficiencies.

Re:Power (1)

Anonymous Coward | about a year and a half ago | (#41936909)

Unless you're talking about a heat pump or other non-resistive method of creating heat, a processor is exactly as efficient as an electric heater drawing the same power.

The difference is that you get some useful computation out of the processor while you do it.

This thing again. (1)

Anonymous Coward | about a year and a half ago | (#41935863)

Moore's law is about the number of transistors on a chip. It has never been about performance.

Re:This thing again. (0)

Anonymous Coward | about a year and a half ago | (#41936577)

Wait? I thought Moore's law was about how some Italian mobster's great-great-great-grandmother slept with a Moor?

HAHAHAHAHHAHA I LOVE THIS GUY!

RIP Tony Scott, thank you QT for the great dialogue.

Here come the ARM zombies (5, Insightful)

AcidPenguin9873 (911493) | about a year and a half ago | (#41935903)

Sigh. It seems there is a new, hip, propaganda trend on Slashdot: pro-ARM articles are posted, and a bunch of ARM zombies come out saying how anything ARM makes will (magically) be lower-power or more power-efficient than anything x86.

So I'll start a tradition of posting this same response every time (originally posted by me here [slashdot.org] ):

"ARM isn't magic; there is nothing in the ARM ISA that makes it inherently lower power than x86. Yes, I'm counting all the decode hardware and microcode that x86 chips need to support legacy ISA. There just isn't much power burned there compared to modern cache sizes, execution resources, and queue/buffer depths which all high-performance cores need regardless of ISA. If you have an x86 processor that targets A9 performance levels, it will burn A9 power (or less if Intel makes it, given Intel's manufacturing advantage). If you have a ARM processor that targets Sandy Bridge performance levels, it will burn Sandy Bridge (or more) power."

Re:Here come the ARM zombies (-1)

Anonymous Coward | about a year and a half ago | (#41936073)

Aaaaaaaarrrrrrrmmsssss!!

Re:Here come the ARM zombies (3, Funny)

ArcadeMan (2766669) | about a year and a half ago | (#41936221)

Aaaaaaaarrrrrrrmmsssss!!

Re:Here come the ARM zombies (1)

ArcadeMan (2766669) | about a year and a half ago | (#41936249)

I'm no engineer but I would assume that it's easier to manage power consumption by shutting down unused cores on a computer with 256 ARM cores than a computer with 4 x86 cores.
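
For what it's worth, the granularity intuition can be sketched with made-up per-core figures (real chips complicate this with per-core DVFS, shared uncore power, and non-zero gated leakage):

<ecode>
def gated_power_levels(chip_watts, n_cores):
    """Power states reachable by gating whole cores on or off, assuming
    the budget splits evenly and a gated core draws ~0 W."""
    per_core = chip_watts / n_cores
    return [k * per_core for k in range(n_cores + 1)]

print(gated_power_levels(16.0, 4))         # 4 big cores: 4 W steps
print(len(gated_power_levels(16.0, 256)))  # 256 small cores: 257 levels
                                           # in ~62.5 mW steps
</ecode>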

Re:Here come the ARM zombies (0)

Anonymous Coward | about a year and a half ago | (#41936795)

Good thing you're not an engineer then.

It depends on what world you live in (0)

Anonymous Coward | about a year and a half ago | (#41935943)

It depends on what world you live in. In consumer PCs, Moore started fading in the late 90s. I seem to recall that as the transition period from "faster, but always $2000" to "about the same speed, and getting cheaper all the time".

I think speed became less important on consumer PCs once they were capable of video playback. That's the most compute-intensive thing most consumer PCs need to do. Further advances in the hardware were driven by bloated OSes and better audio-visual quality.

Step outside the consumer PC bubble, and speed still matters a lot. I don't think Google with its football field sized rooms full of servers thinks performance is no longer relevant. Since chips are a commodity, that performance tends to percolate down to the consumer level.

No, it's still Moore's law (2)

Weaselmancer (533834) | about a year and a half ago | (#41935963)

It is just expressing itself differently as we begin to hit the wall with process size decreases and speed increases. If the wattage of the CPU goes down, you can pack more cores into the same area. Computing power is still going up.

Innovation (0)

Anonymous Coward | about a year and a half ago | (#41935965)

'There’s been a lot more innovation in the world of mobile phones over the last 15-20 years than there has been in the world of PCs.'

While this statement may be true, it is true ONLY because mobile phones have been desperately trying to become PCs, and they're still a long ways off IMO. Without the innovation in the world of PCs we wouldn't have seen said innovation in the mobile phone world.

One system built on top of another.

Not So Sure About That (3, Interesting)

medv4380 (1604309) | about a year and a half ago | (#41936059)

Efficiency only really matters when supply is limited. On a cell phone or any portable system power is limited, and improvements in power efficiency will extend battery life. ARM is a good option when it comes to things like a tablet, but when you start to do everything an Intel-style chip can do, things start to get more tricky. Sure, ARM probably has a lower floor, so its minimum power usage is a lot lower, but when you have it do everything in the same time span as an x86_64 does, then it starts to look too similar to actually matter. Unless using an ARM processor can save me well over 500 bucks a year on my power bill, I don't see their efficiency as that much of an advantage.

It's already true (1)

ArcadeMan (2766669) | about a year and a half ago | (#41936197)

There's already a lot of comments along the lines of "of course he's going to say that, he's the CEO of ARM", but if you think about it, a lot of people who used to buy desktop computers and laptops don't NEED full-fledged computers. No matter which OS you choose, a tablet is more than enough for email, web browsing, instant messaging, taking and managing photos.

There's always going to be a place for full-fledged computers, but the ratio of regular users vs programmers makes us a rounding error at the end of the day.

Re:It's already true (0)

Jackie_Chan_Fan (730745) | about a year and a half ago | (#41936547)

Yeah, who needs a faster, more powerful computer? You're right, only a few people really needed to step up from 286s...

Every year we do more with our computers. For fuck's sake, people, remember the BBS days? Now think about the YouTube days of today: Netflix, complex games, video chat, video conferencing, more bandwidth, more CPU, more RAM....

WE ALWAYS needed more more more, and we still do.

Pocket devices are cute... but they will always be throwaway devices limited by their design.

Moores law == cost per transistor (1)

WaffleMonster (969671) | about a year and a half ago | (#41936247)

I think at the end of the day, what really matters whenever Moore's law is invoked is the underlying issue of cost per transistor... I don't see cost ever becoming irrelevant.

As transistors get cheaper you can take any combination of two paths:

1. Build cheaper gear with same capabilities.

2. Cram more into the same device to increase capabilities while maintaining price.

Either way, Moore's law is still critically important to the industry, no matter who wins a CPU architecture war.

With regards to ARM vs. x86, I am content to make some popcorn and watch from the sidelines as both sides talk shit and throw down ever more energy-efficient gear.

RISC vs CISC redux (0)

Sebastopol (189276) | about a year and a half ago | (#41936333)

Same arguments from the 1990s... again. Remember when the 68000 was lower power than the Pentium 60 because it was RISC?

Yup, that all over again.

Even the magnitude of the design wins is the same today as it was in the early '90s.

Let's see who wins this round! Can the heavyweight (Intel) slim down to welterweight and defeat ARM?

Or will they both live together, spurring on the same innovation (and crazy low, low prices!) we got from the AMD vs. Intel race to 1 GHz?

Tune in next week for more...

TALES!
OF!
INTEREEEEEEEESSSSTTTT!

Re:RISC vs CISC redux (0)

Anonymous Coward | about a year and a half ago | (#41936667)

The 68000 was CISC, not RISC. RISC in the 1990s = MIPS, SPARC, etc.

Let me know when phones become render farms. (2)

Jackie_Chan_Fan (730745) | about a year and a half ago | (#41936505)

Mobile chips are shit to people who need render farms, simulation farms, etc. People still do real work out there.

People who keep crapping on workstations and servers seem to think everyone just needs a computer for texting, facebook and angry birds.

Architecture is becoming irrelevant (2)

gr8_phk (621180) | about a year and a half ago | (#41936685)

He misses another point (though the reference to competition hints at it). With Apple's switch from PowerPC to x86 and now this move to ARM, and with Linux going mainstream via Android on ARM, software developers are getting ever better at making things portable, and hence making the underlying CPU architecture irrelevant. Also notice that Android tried to make this explicit by running most stuff on the Dalvik VM.

Sure, power efficiency and die area are important in many places, but don't think ARM is somehow going to have a lock on that.