Less Is Moore

timothy posted more than 4 years ago | from the 3-slugs-from-a-.44 dept.

Businesses 342

Hugh Pickens writes "For years, the computer industry has made steady progress by following Moore's law, derived from an observation made in 1965 by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months. The Economist reports however that in the midst of a recession, many companies would now prefer that computers get cheaper rather than more powerful, or by applying the flip side of Moore's law, do the same for less. A good example of this is virtualisation: using software to divide up a single server computer so that it can do the work of several, and is cheaper to run. Another example of 'good enough' computing is supplying 'software as a service,' via the Web, as done by Salesforce.com, NetSuite and Google, sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway. Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles."
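(A quick back-of-the-envelope sketch of the compounding the summary describes, taking its 18-month doubling figure at face value; as commenters point out below, Moore's original observation was about transistor counts and the exact period is debated. In shell:)

$ for m in 0 18 36 54 72; do echo "after $m months: $((2 ** (m / 18)))x"; done
after 0 months: 1x
after 18 months: 2x
after 36 months: 4x
after 54 months: 8x
after 72 months: 16x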


342 comments

Let's see (5, Funny)

Rik Sweeney (471717) | more than 4 years ago | (#26642547)

Less: 120884 bytes
More: 27752 bytes

Wow, that's right!

Re:Let's see (5, Funny)

Moridineas (213502) | more than 4 years ago | (#26642595)

Even more literally..!

$ ls -i /usr/bin/less
3603778 /usr/bin/less
$ ls -i /usr/bin/more
3603778 /usr/bin/more
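# same inode number both times: on this system "more" is just a hard link to the less binary, so the two names are literally the same file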

Re:Let's see (5, Funny)

Tibor the Hun (143056) | more than 4 years ago | (#26643133)

All the windows experts are scratching their heads now.
That's OK, maybe that'll make them get off of our lawn too.

Re:Let's see (3, Insightful)

jank1887 (815982) | more than 4 years ago | (#26643411)

You can pipe to more from a DOS prompt too, so a few of us get to stay on the lawn.
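(For anyone who hasn't tried it, the command being described is just the classic pager pipe, which still works at a cmd.exe prompt:)

C:\> dir | more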

Bad Logic (-1)

eldavojohn (898314) | more than 4 years ago | (#26642555)

For years, the computer industry has made steady progress by following Moore's law ...

I guess that's what happens when you cut and paste computer science terms from an Economist article. In the next sentence, you state correctly that Moore's "Law" is an observation not a law! It's not that the computer industry (and I think we're only talking hardware here) follows this observation, it's that historically it has held true. No one's going to make a huge leap in R&D to be able to put 10x the number of transistors on a chip only to have engineers come down on them to stop it saying "no one has ever broken Moore's Law and we're not going to start now!" That idea is preposterous. We're limited by our own technology that happens to follow an ok model, it's not a choice!

Another example of 'good enough' computing is supplying 'software as a service,' via the web, as done by Salesforce.com, NetSuite and Google, sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway.

I don't see how SaaS supports the idea of consumer demand for yesterday's computer cheaper today. SaaS is a new business model for software, not a response to us settling on "good enough" computers--and it's been growing for a lot longer than just recently. The software of yesterday can still run on this hardware. Google Docs, online CRM & accounting tools are SaaS, but they are not evidence of consumers wanting cheaper netbooks with less computing power. I think you will find that OO.o runs just as well on a netbook as Google Docs; I know my older machines have a hard time with the amount of memory my browser sucks up when looking at a large Google spreadsheet or Google doc. Again, I don't think the logic behind SaaS is that we need to cope with weaker client machines; I think SaaS has other more important benefits like netting more money, avoiding piracy, being easier to patch, etc.

Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version.

Aside from how ridiculous that statement sounds to me ("Vista makes your computer run faster?"), it is my opinion that the Vista Capable debacle [slashdot.org] drove this more than consumer demand. Microsoft is notorious for ignoring customer desires to fix what they have and offering unprompted additions and UI changes.

I think what you are witnessing is consumers and businesses hurting because of the shrinking economy and a $250 netbook is looking mighty affordable to them. This isn't going to stop any of the companies doing R&D to keep pace with Moore's observation.

Re:Bad Logic (-1)

MrEricSir (398214) | more than 4 years ago | (#26642675)

Moore's law is hardly an "observation." The guy ran Intel, it was his directive to his employees.

Re:Bad Logic (5, Informative)

AKAImBatman (238306) | more than 4 years ago | (#26642849)

Um, no it wasn't. "Moore's law" is a term that was coined after Thomas Moore gave a presentation showing that the company was managing to double transistor density each month. This observation created an interesting problem for the company. What should they do with all those extra transistors?

One option was that they could keep getting higher yields on existing chips, eventually driving the cost per unit to mere fractions of a penny. The other option was that Intel could do something useful with all that extra circuitry and maintain higher prices.

Considering that contemporary CPUs of the time were barely more powerful than the interrupt controller sitting next to them, using that silicon for sophisticated 32bit processors with on-die floating point units and SIMD instructions seemed like a no-brainer for the company. Thus as each successive generation of technology has made CPUs smaller, Intel has used the extra space to add more features and more optimizations.

At this point, things are getting a bit ridiculous. CPU manufacturers have so much extra space on which to work that they can fit 2-4 CPU cores on a single die and STILL produce a smaller chip than the last generation.

Re:Bad Logic (1)

AKAImBatman (238306) | more than 4 years ago | (#26642983)

s/Thomas/Gordon/g

*waves hand* These aren't the typos you're looking for...

Re:Bad Logic (5, Informative)

commodore64_love (1445365) | more than 4 years ago | (#26643047)

And now I'm going to do something shocking and unprecedented. I'm going to look-up the actual quote, instead of guessing what Moore's "Law" means.

"April 1965:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ..." Notice he claimed *complexity* not power doubled, and that it happened EVERY year. His original statement has not held true.

Re:Bad Logic (1)

AKAImBatman (238306) | more than 4 years ago | (#26643427)

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year

Translation: The number of transistors doubles.

I said nothing about "power". Those are your words, not mine.

Re:Bad Logic (1)

jawtheshark (198669) | more than 4 years ago | (#26642869)

Come on! Are you for real? It is an observation by a co-founder of Intel... Look it up on Wikipedia [wikipedia.org]: The trend was first observed by Intel co-founder Gordon E. Moore in a 1965 paper.

Re:Bad Logic (1)

ShieldW0lf (601553) | more than 4 years ago | (#26642987)

One might even interpret it as a self-fulfilling prophecy.

Why sell chips that are 10x as fast, when you can sell chips that are 2x as fast, then sell the same people new chips that are 2x as fast, repeatedly?

It almost seems like a cartel engaged in price fixing. I expect that time will reveal that is what it always has been...

Re:Bad Logic (3, Insightful)

jonbryce (703250) | more than 4 years ago | (#26643125)

Not really, especially in the days when you had Intel and AMD racing to be the producer of the fastest chip.

Re:Bad Logic (1)

jawtheshark (198669) | more than 4 years ago | (#26643167)

Wow.... Okay, any conspiracy theorists on that one? :-P

Re:Bad Logic (0)

Anonymous Coward | more than 4 years ago | (#26642693)

[...] Moore's "Law" is an observation not a law!

Couldn't you say the same about Newton's Laws?

Re:Bad Logic (0)

Anonymous Coward | more than 4 years ago | (#26642905)

Couldn't you say the same about Newton's Laws?

Possibly, it depends on who you ask. Some philosophers argue that laws of nature are simply repetitions of observation which do nothing to prove that the same thing will occur the one-millionth (+1) time (Hume). Physicists attempt to tie underlying principles of mathematics with empirical observations to cogently state that the underlying mechanics of the nature of reality operate with universality.

So, at the end of the day, fuck it, buddy: it's turtles all the way down.

(Oh, by the way, this is mfh [slashdot.org] posting as AC to avoid the karmic hurtings of the badness of the /. nature)

Re:Bad Logic (1)

ktappe (747125) | more than 4 years ago | (#26642851)

Moore's "Law" is an observation not a law

And you're arguing semantics, not actual facts. Ever heard of "gravity, it's not just 14ft/sec^2, it's the law"? Same usage.

I think what you are witnessing is consumers and businesses hurting because of the shrinking economy and a $250 netbook is looking mighty affordable to them.

...which is pretty much what the original article is stating: consumers want cheaper prices not faster PC's.

This isn't going to stop any of the companies doing R&D to keep pace with Moore's observation.

It certainly will slow R&D as they lay off workers, so I challenge you on this point too.

Re:Bad Logic (1, Funny)

Anonymous Coward | more than 4 years ago | (#26642957)

Ever heard of "gravity, it's not just 14ft/sec^2, it's the law"

Where do you live?

Re:Bad Logic (1)

Moridineas (213502) | more than 4 years ago | (#26642965)

And you're arguing semantics, not actual facts. Ever heard of "gravity, it's not just 14ft/sec^2, it's the law"? Same usage.

Not to be overly pedantic in this thread on semantics, but... 14ft/sec^2??

http://www.google.com/search?q=1+g+to+ft/s%5E2 [google.com]

Re:Bad Logic (0)

Anonymous Coward | more than 4 years ago | (#26643005)

...which is pretty much what the original article is stating: consumers want cheaper prices not faster PC's.

Oh come on. Did you really expect him to RTFS?

Re:Bad Logic (3, Interesting)

Chris Burke (6130) | more than 4 years ago | (#26642885)

I guess that's what happens when you cut and paste computer science terms from an Economist article. In the next sentence, you state correctly that Moore's "Law" is an observation not a law! It's not that the computer industry (and I think we're only talking hardware here) follows this observation, it's that historically it has held true. No one's going to make a huge leap in R&D to be able to put 10x the number of transistors on a chip only to have engineers come down on them to stop it saying "no one has ever broken Moore's Law and we're not going to start now!" That idea is preposterous. We're limited by our own technology that happens to follow an ok model, it's not a choice!

Yes, that's all true, but if you don't think chip makers throw up graphs with a curve on them for Moore's Law and use that as a guideline for where they should be in the future, which could be called "following"... you're mistaken. Obviously if the observation continues to hold true, that's only because of the advances in R&D that produce new technology. However those advances come as a result of choices, like how much and what kind of R&D to do, and those choices are themselves driven in part by Moore's Law.

Now as far as going faster and getting 10x more transistors on a chip, sure that's not much of a choice. That's because the industry is already busting its ass to maintain the current exponential trend. For that very reason I'd never take the phrase "following Moore's Law" to mean intentionally limiting technology advancement. Au contraire, if anything I take it to mean we're "following" in the sense that you'd be "following" Usain Bolt in the 200m dash -- if you're anywhere near keeping up, you're a bad ass. The only motivation would be to drop off that pace.

Which, to some extent, we've already seen in the 00's. It's still exponential growth, but the time factor has increased somewhat. I can't remember the data I saw, but it appeared to have gone from a doubling every 18 months to 24?

By the way, I agree the examples are pretty poor. For virtualization you want the newest beefiest processor with the best hardware support for virtualization you can get. The whole idea is that you want a single machine to appear as though it is a plethora of machines each with enough horsepower to do whatever that specific machine needs to do. This is the opposite of just wanting to do the same thing cheaper, it's wanting to do the same thing times a plethora, so you need a machine that is at least one plethora times as powerful. Being cheaper overall is just a desirable side effect. I hope you agree that "plethora" is a great word.
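(A minimal way to check for the "hardware support for virtualization" the parent mentions, assuming a Linux host: the vmx (Intel VT-x) and svm (AMD-V) flags in /proc/cpuinfo advertise the CPU's virtualization extensions, one flags line per logical CPU, so any non-zero count means the support is there. The output shown is illustrative:)

$ grep -E -c 'vmx|svm' /proc/cpuinfo
8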

Re:Bad Logic (1)

BobMcD (601576) | more than 4 years ago | (#26643105)

Yes, that's all true, but if you don't think chip makers throw up graphs with a curve on them for Moore's Law and use that as a guideline for where they should be in the future, which could be called "following"... you're mistaken. Obviously if the observation continues to hold true, that's only because of the advances in R&D that produce new technology. However those advances come as a result of choices, like how much and what kind of R&D to do, and those choices are themselves driven in part by Moore's Law.

Precisely. Those choices are key and the Moore's Law expectation absolutely has to factor into it somewhere. My own opinion is that it sets the limit for where they can stop and relax their efforts, internally.

That's because the industry is already busting its ass to maintain the current exponential trend. For that very reason I'd never take the phrase "following Moore's Law" to mean intentionally limiting technology advancement.

Given the amount of secrecy in this industry, I'm not certain how you can back this statement up with any fact. My own assumption is that 'they' have developed technology far more capable than what they currently claim to be working on at any given time. I personally believe that what they claim is on the drawing board is actually in prototype, what they claim to be in dev is actually ready for production, and their 'latest and greatest' is already old tech.

In my mind, this is the only way to sustain this curve - by limiting the release of new technology onto the market until Moore's says that it is time for it.

Think about it in terms of least-effort: What's easier, to control the rate of release, or to 'bust your ass' keeping up with Usain Bolt?

Re:Bad Logic (0)

Anonymous Coward | more than 4 years ago | (#26642917)

I would also like to add that anyone who says "$adjective is $opposite" to look smart deserves a foot up the ass.

Re:Bad Logic (1)

poot_rootbeer (188613) | more than 4 years ago | (#26642929)

I think what you are witnessing is consumers and businesses hurting because of the shrinking economy and a $250 netbook is looking mighty affordable to them.

Even more affordable: keep using the same computer I've been using up until now, even if it's four or five years old at this point, for a cost of $0. It's still 'fast enough' for almost everything I'd want to do, and if I start to run out of disk space, I can buy external USB storage for $100 per terabyte.

Re:Bad Logic (1)

Bob-taro (996889) | more than 4 years ago | (#26642969)

"Vista makes your computer run faster?"

I think you misread the article. They claim Windows 7 is going to be the first Windows version that is faster than its predecessor (in this case Vista) on the same hardware.

Re:Bad Logic (1)

jonbryce (703250) | more than 4 years ago | (#26643161)

Wasn't NT4 faster than NT 3.51? Or at least, it had lower memory requirements.

Re:Bad Logic (1)

jonbryce (703250) | more than 4 years ago | (#26643053)

The most popular SaaS application is webmail. People use that because it saves the hassle of setting up an email client, and they can use it from any computer by just firing up a web browser.

I personally don't like it because I like to have my data stored on my own computer, although I do have webmail software installed on my own computer so I can read my emails when outside.

Re:Bad Logic (5, Insightful)

timholman (71886) | more than 4 years ago | (#26643085)

Microsoft is notorious for ignoring customer desires to fix what they have and offering unprompted additions and UI changes.

Yes, and as so many have pointed out, their history of doing so is now backfiring on them in a big way. And it's not just with Vista, it's with Office as well.

Case in point - several months ago my department bought upgrade licenses to Office 2008. I was perfectly happy with Office 2004, but I installed Office 2008 because I knew that if I didn't, I wouldn't be able to read whatever new formats that Office 2008 supported. It had happened with every other Office upgrade cycle in my experience - you either upgraded or you'd be unable to exchange documents with your peers.

But something funny happened this time - I have yet to receive a .docx, .xlsx, or .pptx file from anyone. I have quite consciously chosen to save every document in .doc, .xls, or .ppt "compatibility" format. Everybody I talk to says they're doing exactly the same thing. Everyone now knows the game that Microsoft plays, and no one is willing to play it anymore. I could have stayed with Office 2004 and never noticed the difference. So what motivation will I have to upgrade to the next version of Office?

If it weren't for Microsoft's OEM licensing deals, Vista would have a tiny fraction of its current market share. XP is "good enough". But Microsoft doesn't push Office onto new machines the way it does Windows, the older Office formats are also "good enough", and you have open source alternatives like OpenOffice if Microsoft tries to deliberately break Office compatibility on the next version. I fully expect Microsoft's Office revenues to take a steep dive in the next few years. The Vista debacle is only the beginning.

Re:Bad Logic (1)

jonaskoelker (922170) | more than 4 years ago | (#26643407)

[...]If so, it will be the first version of Windows that makes computers run faster than the previous version.

Aside from how ridiculous that statement sounds to me ("Vista makes your computer run faster?")

  • Order your computer to perform some task, and note the time
  • Wait for the task to finish, observe the time
  • Compare the difference between the two times to a measured difference on another system
  • Conclude that one system makes your computer perform tasks faster ("run faster") than the other

Of course ${OS} ${VERSION} won't bump your CPU cycle frequency or increase your cache size.

But if one OS performs tasks in less time than another, if one thinks of "the computer" as an (OS, hardware) bundle, it does make the computer run faster.

Is it really that ridiculous?
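(In shell terms, the comparison being described is just wall-clock timing of the same job under each (OS, hardware) bundle; a minimal sketch with a hypothetical job and illustrative numbers:)

$ time ./nightly_report.sh
real    1m42.317s
user    1m31.004s
sys     0m8.122s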

Economics (2, Interesting)

Mephistophocles (930357) | more than 4 years ago | (#26642559)

Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?

Operating versus capital (1)

goombah99 (560566) | more than 4 years ago | (#26643137)

Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?

You have a good point that this could be a test of your hypothesis. The purchase of a computer by a company or government is frequently considered a "capital" purchase, even though over time the cost of computing is dominated by the operating cost of software, power, upgrades and IT.

However, since capital is usually scarce in organizations, it tends to drive acquisition decisions. People buying things that they can't easily replace will tend to seek higher performance equipment.

But that may be about to change. Things like cloud computing let people push their computer budgets into operating budgets. If they are lucky they might even become "overhead" contracts paid for by the company rather than by the project.

So there is a huge incentive now to further commoditize computing to the point where people compete at the bottom of the barrel rather than at the high end.

Dell has sought to prevent that. For example, you can't buy those el cheapo Dells you see advertised in the back of the Sunday paper glossy if you work in the government or a large corporation.

But the netbooks are now cracking that facade, since you can buy those on a gov't contract. Dell will have to respond at some point.

It's a new economics. Will it drive Moore's law too?

Wait for it... (0, Troll)

djupedal (584558) | more than 4 years ago | (#26642581)

...around here, more is Microsoft. So, let's get on with it and find a tie-in so we can cash our payoff checks. Oh! Wait...there it is!

Microsoft is everything /. and /. is everything microsoft! What a wonderful thing, being able to come to one place and read all about wonderful microsoft...just wonderful!

Re:Wait for it... (1)

Farmer Tim (530755) | more than 4 years ago | (#26642857)

What a wonderful thing, being able to come to one place and read all about wonderful microsoft...just wonderful!

Forewarned is forearmed, as they say. Still, at least we're not discussing Apple imploding into Steve Jobs' digestive system again, or gaining vital insight into which brand of deodorant* Linus Torvalds uses.

*Real Linux developers don't use deodorant, so that'd be silly.

Faster Windows! (1, Insightful)

D Ninja (825055) | more than 4 years ago | (#26642583)

If so, it will be the first version of Windows that makes computers run faster than the previous version.

Nooo...computers are running at exactly the same speed. They just won't have to chew through bloated software. Microsoft is (supposedly) making their software more efficient.

Can't stand writers who don't understand tech.

Re:Faster Windows! (1)

liquidpele (663430) | more than 4 years ago | (#26643029)

This is *exactly* what I thought. You can't go cheaper without using older or more efficient software because all the new stuff requires powerful hardware! Of course, this is one place Linux can really shine, because even though it has gotten more bloated on most distros' default installs, it's quite easy to trim out what you don't need.

Because you don't need more cycles in biz (4, Insightful)

Opportunist (166417) | more than 4 years ago | (#26642609)

Let's be honest here. What does the average office PC run? A word processor, a spreadsheet, an SAP frontend, maybe a few more tools. And then we're basically done. This isn't really rocket science for a contemporary computer; it's neither heavy on the CPU nor on the GPU. Once the computer is faster than the human, i.e. as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user does not get to see a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.

And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

Re:Because you don't need more cycles in biz (0)

Anonymous Coward | more than 4 years ago | (#26642703)

Let's be honest here. What does the average office PC run? A word processor, a spreadsheet, an SAP frontend, maybe a few more tools.

I have sympathies with Tim Allen, I always believe in "More power!"

I believe in the future, computers will have to get more powerful for AI and advanced UI (recognizing what the user wants and not just through the keyboard and mouse). There are real productivity gains still to be had in that area for the regular business.

Re:Because you don't need more cycles in biz (0)

Anonymous Coward | more than 4 years ago | (#26642829)

Artificial intelligence is not really that computationally complex. Current hardware can perform voice recognition easily, for example. Improving artificial intelligence requires improving current algorithms rather than throwing more juice at the problem.

Today's code word is: [redneck]

Re:Because you don't need more cycles in biz (0)

Anonymous Coward | more than 4 years ago | (#26643237)

Eh, maybe. As a CS teacher of mine said, the field of AI still hasn't had its Einstein. We really don't know what the future of AI looks like. What if we can develop models that give incredible results, but take a lot of processing? It's not out of the question.

Re:Because you don't need more cycles in biz (1)

AKAImBatman (238306) | more than 4 years ago | (#26642937)

I have sympathies with Tim Allen [...] I believe in the future, computers will have to get more powerful for AI

Am I the only one who read that as, "computers will have to get more powerful for Al [Borland]"?

I'm sitting here, doing my best Kirk impression, thinking "What does Al need with a more powerful computer?" (Or a starship, even?)

Re:Because you don't need more cycles in biz (4, Insightful)

Hognoxious (631665) | more than 4 years ago | (#26642805)

as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user does not get to see a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved

It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.

Re:Because you don't need more cycles in biz (1, Troll)

Opportunist (166417) | more than 4 years ago | (#26642893)

If I cannot turn that crap off, I don't want the software. If I can turn it off, I turn it off.

An interface is supposed to do its job. When I play games or when I watch an animation, I want pretty. When I work, I want efficiency. Don't mix that and we'll remain friends.

Re:Because you don't need more cycles in biz (1)

Skater (41976) | more than 4 years ago | (#26642927)

We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

Exactly. I have a 5-year-old laptop with a Pentium 2.4 gigahertz processor, but even with today's software (latest versions of OO.org, Firefox, Google Earth, etc.) it runs just fine. Sure, a newer computer would be somewhat faster, but this is not "so slow it's painful" like my Pentium 133 was 5 years after I bought it.

It works well enough that I recently put in a larger hard drive and a new battery to keep it useful for the foreseeable future, because I do not intend to replace it until it dies (or until I have all kinds of 'extra' money to blow on a laptop). And I'm seriously considering trying to find someone to pay to get suspend/resume working on it under Linux (I believe it has a quirky BIOS since ACPI was still fairly new back then; BSD was also unable to successfully resume on it).

Re:Because you don't need more cycles in biz (1)

jollyreaper (513215) | more than 4 years ago | (#26642943)

And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

Agreed. There would have to be a new paradigm shift (ok, I fucking hate that word, maybe usage needs to change?) to make an upgrade worthwhile.

For my personal needs, DOS was good for a long time. Then Win95 came out and true multitasking (ok, kinda working multitasking that still crashed a lot) made an upgrade compelling. I couldn't really use any of the browsers on my old dos box and Win95 opened a whole new world. That computer got too slow for the games eventually and that drove the next upgrade. Online videos helped drive the next upgrade.

As far as business computers go, there's not anything else compelling to urge the next upgrade. Microsoft Office hasn't really advanced much since Office 2000, just gotten slower. The user interface isn't any more intuitive, there are no high-resource but awesome results features like true voice recognition, nothing to make an office manager's eyes go wide and say "We must upgrade for this."

So far we're seeing cooler ideas in the form of new software but nothing that really justifies an upgrade, especially for office users. About the only upgrade I think is justifiable is multiple monitors for people who are having to keep a bunch of plates spinning at the same time.

Re:Because you don't need more cycles in biz (1)

Opportunist (166417) | more than 4 years ago | (#26643123)

And, bluntly, I don't see any "we must have this!" features in any office standard application, at least since 2000.

Multitasking was a compelling reason. It became possible to run multiple applications at once. Must-have.

Better interweaving between office products and email was a must-have, too.

Active Directory (and other, similar technologies) made administrating multiple accounts a lot easier and certainly helped speed up rollouts. Also a must-have (for offices, but we're talking office here).

And so on. But we are now pretty much at the apex of usability. I can't really think of anything (office related) that's absolutely necessary that any Windows OS since 2k is lacking. Maybe there's a little room in problem diagnosis (i.e. a tool to make helpdesk requests go faster and make it easier to determine the source of the problem), and there is also still a bit of room in the ability to weave smartphones and other mobile devices more seamlessly into the email/office applications, but I don't really see any room for improvement in the "ordinary" everyday office work.

Re:Because you don't need more cycles in biz (2, Interesting)

zappepcs (820751) | more than 4 years ago | (#26642947)

Let's be honest here. What would we like the average office PC to be doing? If they are beefy enough to run a grid on, and so also perform many of the data retention, de-duplication, HPC functions, and many other things, then yes, having faster-better-more on the desktop at work could be interestingly useful. Software is needed to use hardware that way, meh.

There will never be a time in the foreseeable future when beefier hardware will not be met with requirements for its use. Call that Z's corollary to Moore's Observation.... if you want.

Re:Because you don't need more cycles in biz (1)

magisterx (865326) | more than 4 years ago | (#26642963)

For most office users, you are completely correct. There do remain plenty of us who do more intensive analysis and code compilation that can require more time, and faster hardware is quite welcome there.

More interestingly, faster hardware can open up new applications that no one is seriously trying right now. It would go a long way to assisting with voice recognition, for one thing, and handwriting recognition can improve too. And those are just what I can think of offhand. As better hardware is provided, new things will be developed to use it.

Re:Because you don't need more cycles in biz (1)

mrbcs (737902) | more than 4 years ago | (#26643165)

EXACTLY! You, sir, fine Opportunist, have won the thread! This is why the industry will almost die in the next few years. The upgrade cycle died long ago. People are happy with their machines, and the last (arguable) thing to cause upgrades was games.

I have an AMD dual core 4400 with a couple of Nvidia 7300's that will take anything I throw at it. I don't think I'll ever need another computer unless the thing fries. They have also become so cheap that they have become commodities. Why fix something for $200 when you can have the new faster version for $350?

Sad that the end result of cheap computers and shitty Microsoft software will be landfills full of perfectly functional computers (once they are reinstalled). I routinely pull P4 2 GHz machines from the "e-waste" trailer here. Sad.

Re:Because you don't need more cycles in biz (1)

Culture20 (968837) | more than 4 years ago | (#26643431)

the industry will almost die in the next few years. [...] I don't think I'll ever need another computer unless the thing fries.

Which means that clamoring for cheap will mean hardware makers will design _more_ early failure into hardware, and reduce warranties to nil.

Re:Because you don't need more cycles in biz (1)

chrysrobyn (106763) | more than 4 years ago | (#26643187)

Once the computer is faster than the human ... "fast enough" is achieved ... And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

If we were talking about CPU power, I'd completely agree with you. A Pentium IV was fast enough for most people, and a modern Core2Duo is more than enough. I still get to points where my system chokes on my hard drive (stupid virus checker among other corporate requirements). A solid state disk with lower throughput but an order of magnitude less seek time would improve my user experience -- since I'm not using more than 16 GB even with some personal MP3s, I'd hope the cost wouldn't be much higher than more spacious spinning media. Also, there are times when improving the network infrastructure would make things better, too. Not that my company LAN really has problems these days, but I often hit the maximum of what my home cable modem can do.

Maybe this is why Intel is reaching into the SSD arena, because CPUs are effectively "mature" for most people. As an example of "next generation just powerful enough CPUs", a dual core Atom (completely symmetric) or an Atom / half Merom hybrid (one core "stronger" than the other) would be interesting. As a corporate user, I wonder if some servers would benefit from a half dozen Atoms with a big brother I7 core to handle the heavy lifting.

Re:Because you don't need more cycles in biz (1)

DMalic (1118167) | more than 4 years ago | (#26643303)

We certainly haven't reached that point. However, it might be more about hard drive I/O than CPU speed, and it has nothing to do with the GPU.

Re:Because you don't need more cycles in biz (5, Funny)

Tubal-Cain (1289912) | more than 4 years ago | (#26643321)

We reached the point of "fast enough" years ago. Computers were so fast we needed a semi-useful way to waste cpu cycles. And so the GUI was born.

Re:Because you don't need more cycles in biz (1)

ChienAndalu (1293930) | more than 4 years ago | (#26643327)

You forgot "anti virus software". By fare the biggest (and probably least useful) resource hog.

Gamers need not apply (0, Offtopic)

StrifeJester (1326559) | more than 4 years ago | (#26642617)

Even in the downturn there will still be gamers that want more out of what they are playing. Look how many people are still waiting for something to play Crysis decently, and GTAIV is a joke with its lack of anti-aliasing.

Re:Gamers need not apply (1)

DMalic (1118167) | more than 4 years ago | (#26643353)

My 9600 GT is about six months plus old, and it cost about $100 then. It plays Crysis just fine, though at a low res (1440xwhatever). I like higher framerates, but don't need them, and the motion blur does a great job. That res isn't as nice as it could be, but it's better than what consoles will do for their games. I agree with your point, though.

I say AWWWW fo botnets (1)

indi0144 (1264518) | more than 4 years ago | (#26642627)

An Athlon XP should be enough for almost any home user, unless you like to hand real power to your botnets.

Or... (2, Interesting)

BobMcD (601576) | more than 4 years ago | (#26642631)

That could be bad news for computer-makers, since users will be less inclined to upgrade only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles.

Or, it could be good news for them. Especially in light of things like the "Vista Capable" brouhaha, and the impact Vista had on RAM prices when fewer than the projected number of consumers ran out to buy upgrades.

Maybe Intel and NVidia are going to be wearing the sadface, but I'm willing to wager HP and the like are almost giddy with the thought of not having to retool their production lines yet again. They get to slap on a shiny new OS and can keep the same price point on last year's hardware.

Some of the corporations in the world buy new hardware simply to keep it 'fresh' and less prone to failure. My own company has recycled a number of Pentium 4 machines that are still quite capable of running XP and Internet Explorer. With the costs of new desktop hardware at an all-time low for us, we get to paint a pretty picture about ROI, depreciation, budgets, and the like.

Re:Or... (2, Insightful)

craighansen (744648) | more than 4 years ago | (#26642775)

In fact, M$ might end up settling these Vista-Capable lawsuits by offering upgrades to W7, especially if it's faster on the barely-capable hardware that is the subject of the suit. Cheap way for them to settle...

Re:Or... (1)

EmperorKagato (689705) | more than 4 years ago | (#26643031)

They won't settle and will not need to settle. The OEM companies are at fault for placing and selling Vista Capable products well below the required specs. Those machines met the Vista Basic requirements, yet they were being sold as machines that would handle Vista with flying colors.

Re:Or... (1)

Jason Earl (1894) | more than 4 years ago | (#26643183)

Unfortunately, if companies start buying computers in 5 or 6 year cycles instead of 2 or 3 year cycles then HP definitely won't be giddy. They'll be even less giddy if the average price of a desktop PC drops to $200 and the average price of a laptop sinks to $500.

Sigh (3, Insightful)

Bastard of Subhumani (827601) | more than 4 years ago | (#26642635)

companies would now prefer that computers get cheaper rather than more powerful

There's already a method for that: it's called by the catchy title "buying a slightly older one".

A related technique is called "keeping the one you've already got".

Re:Sigh (1)

robot_love (1089921) | more than 4 years ago | (#26642803)

A related technique is called "keeping the one you've already got".

I don't know... That sounds expensive.

It's funny, but it's also true. (1)

khasim (1285) | more than 4 years ago | (#26643269)

A previous company I worked for would lease their workstations for 3 years. That did mean that they were constantly paying for computers ... and rolling out new boxes.

But there weren't many problems with the HARDWARE during those 3 years.

As they started keeping the workstations longer, there were more problems with the hardware AND there were problems with replacing the hardware that broke. Which was leading to a non-uniform desktop environment. It's more difficult to support 100 different machines with 100 different makes / models, etc than it is to support 100 identical machines.

Re:Sigh (0)

Anonymous Coward | more than 4 years ago | (#26643273)

A related technique is called "keeping the one you've already got".

I don't know... That sounds expensive.

You've never been divorced...

Re:Sigh (0)

Anonymous Coward | more than 4 years ago | (#26643223)

"buying a slightly older one" has a possible downside of "More prone to breakdown and require replacement without a warranty..."

Less is More (0)

Anonymous Coward | more than 4 years ago | (#26642641)

With added features. Now if only bash would default auto-complete to use it...

Wait, there's a story somewhere?

This is nothing new (4, Interesting)

georgewilliamherbert (211790) | more than 4 years ago | (#26642665)

Some of you may remember the 1980s and early 1990s, where PCs started out costing $5,000 and declined slowly to around $2,500 for name brand models.

Around 1995, CPUs exceeded the GUI requirements of all the apps then popular (this is pre-modern gaming, of course). Around 1996 and into 1997 the prices of PCs fell off a cliff, down to $1,000.

Those who fail to remember history...

Re:This is nothing new (2, Interesting)

jawtheshark (198669) | more than 4 years ago | (#26643043)

Adjust those numbers for inflation... Or better, retro-adjust current prices to 1980 prices.

I do remember falling prices in the nineties, but now a PC is pretty close to an impulse buy. For me in 2000, a 2000€ PC was already an impulse buy (that said, I was single, a successful consultant with a brand-new sports car, so my "impulses" were a bit off). These days an EEE PC is an impulse buy for anyone who loves toys and has a bit of spare money.

This is not a repeat of the previous price falls; this is the computer becoming a throw-away consumer item like a toaster. (Running NetBSD obviously ;-) )

Poor Intel (1)

the eric conspiracy (20178) | more than 4 years ago | (#26642667)

Continually making the same thing for less money is not a very good business model.

Pretty soon the customers will be asking for the same performance, free.

Reminds me of the old quote, "We have been doing so much with so little for so long we are now qualified to do anything with nothing at all".

Recessions are wonderful things (5, Insightful)

benjfowler (239527) | more than 4 years ago | (#26642669)

This could simply be down to the tanking economy: people look at what they're spending, and quickly realise that:

1) the upgrade treadmill over the last twenty years has produced insanely powerful and dirt-cheap hardware. When was the last time you had trouble running Linux on your hardware? I'm old enough to remember!

2) and that you don't need teraflops of CPU/GPU power just to draw greasepaper-style borders around your Microsoft Word windows. Perhaps the entire industry has woken up and seen how unbelievably wasteful modern computing is, and have decided to take the dividend of Moore's Law in cash instead.

3) recessions are good for purging wasteful and suboptimal behaviour generally.

Maybe people will realize what an obscene waste of money and computing power an operating system like Windows Vista, which requires a gig of RAM to run, really is.

Smaller (1)

Ohio Calvinist (895750) | more than 4 years ago | (#26642677)

Cost is very important, particularly in business, but for home users the price for a "good enough" PC has been in the same $600-1200 range for a long time. What I think will drive sales, particularly for the home and the mobile professional, is size, fewer wires, and lower energy use. Folks are willing to pay more for that, rather than for more powerful chips that they don't need. This should be good news for OEMs, because it is easier to show the "average" user a more aesthetic case, or more wireless peripherals, or a better/faster OS, than to convince them that Word, IE and MSN Messenger are going to be "better" on a 2.4GHz chip over a 2.0GHz chip.

The only companies that should be frightened are CPU manufacturers (particularly as more and more functions are being passed off to GPUs, and as developers are not catching up to multi-core development as fast as they'd like).

Re:Smaller (1)

Lonewolf666 (259450) | more than 4 years ago | (#26643289)

PC hardware has left software requirements somewhat behind, unless you want to run the very latest games.
My dual core PC from 2007 is still more than sufficient in terms of performance. The price to put a similar or better machine together has dropped from 800 Euros to 500 Euros, however (without monitor). That is assuming
-the same case
-a comparable power supply
-same memory (2 GByte)
-a slightly faster but less power-hungry CPU (AMD 45nm vs. 90nm, each in the energy-efficient version)
-a faster GPU (ATI 4670 vs. NVidia 8600GT)
-a harddisk with twice the capacity - 500 GByte disks are cheap today

If you add a good screen, you may end up paying 700-800 Euros. Cheap enough for most private users, and companies that try to squeeze a few hundred out of this at the expense of good working conditions are just plain stupid.

Less Moore (0, Offtopic)

89cents (589228) | more than 4 years ago | (#26642717)

There once was a man named Less Moore.
He got shot with 4 rounds from a .44.
No less, no more.

The bells and whistles nobody uses... (3, Interesting)

Tenek (738297) | more than 4 years ago | (#26642721)

sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway

I think if you took out all the features that 'hardly anyone uses' you wouldn't have much of a product left. Bloatware and the 80/20 Myth [joelonsoftware.com]

Re:The bells and whistles nobody uses... (1)

w0mprat (1317953) | more than 4 years ago | (#26643069)

So I'm starting to suspect that fretting about bloatware is more of a mental health problem than a software problem.

Amen.

Dirty Industry Secret (3, Interesting)

Jason Levine (196982) | more than 4 years ago | (#26642741)

Years back, when everyone in the mainstream was trotting out how many MHz/GHz their processors ran and how their Latest And Greatest system was *soooo* much better, I insisted that the computer industry had a dirty little secret. The mid to low end computers would work just fine for 90% of users out there. Computer makers didn't want people knowing this and instead hoped that they would be convinced to upgrade every 2 or 3 years. Eventually, though, people learned that they were able to read their e-mail, browse the web, and work on documents without buying a system with a bleeding edge processor and maxed out specs. This seems like the continuation of the secret's collapse. People are realizing that not only don't they need to buy a system with a blazing fast processor just to send out e-mail, but they don't need to buy 10 different servers when one powerful (but possibly still not bleeding edge) server can run 10 virtual server instances.

Re:Dirty Industry Secret (0)

Anonymous Coward | more than 4 years ago | (#26642975)

We ended up virtualizing on a little known platform which sounds exactly like the kind of server you are talking about. Big on RAM, modest on CPU, plenty of network connectivity and a price that would shame Dell, not to mention HP and IBM. When you virtualize, RAM is the critical resource more often than not. We went for the lower end of the v1624 appliances [360is.com] from 360is [360is.com] and haven't regretted it. Regularly run 40 VMs per host.

LB.

The EEE PC did prove this (1)

jawtheshark (198669) | more than 4 years ago | (#26642743)

I have one of the "original" EEE PCs, the 701 4G. Even with the stock 512Meg RAM and the CPU underclocked to 670MHz, I can do pretty much everything I want to do. The limitation here being the screen (Thank you slashdot for the "previous track icon" on the front page!)

Let's see: I do Word Processing (AbiWord), Spreadsheets (Gnumeric), Browsing (Iceweasel), Email (Icedove), and many things more. All courtesy of the Debian EEE Project. (I run LXDE and the aforementioned applications). For day-to-day personal stuff this is plenty.

When it still ran Xandros, I even installed Eclipse for fun and kicks. While it wasn't that fast, it was doable, and after configuring the interface minimally it was even halfway usable. Granted, I wouldn't want to use it over 8 hours.

I hate to be the "640KByte is enough for everyone" guy, but a 1GHz machine with 512Meg RAM and a lean operating system is indeed enough for most uses.

That also explains my wife's machine: a P-IV 2.6GHz Hyperthreaded which was bought in 2003 and is still our primary and only desktop. (Had a few minor upgrades...) We have no intent of changing it. It runs Windows XP Pro.

Faulty on many fronts (1)

defile39 (592628) | more than 4 years ago | (#26642757)

1) No one is following Moore's law. It's a description of what happens.

2) You can, of course, come up with some equation that describes the cost of a set amount of processor power over time.

3) This article and this summary make bad economic assumptions and use faulty logic. I suggest to all reading the comments that it's not worth reading.

That is all.

Re:Faulty on many fronts (2, Funny)

RagingFuryBlack (956453) | more than 4 years ago | (#26643139)

You just admitted to RTFA. This is /. Nobody RTFA here. Turn in your /. ID at the door. Thank you.

And now it's up to software (1)

Leptok (1096623) | more than 4 years ago | (#26642761)

To put all those idle resources to good use. There needs to be a new generation of programs and games to put all the hardware to work.

Multiseat, LSTP, thin client (1, Insightful)

Anonymous Coward | more than 4 years ago | (#26642811)

There are many options: multiseat, the Linux Terminal Server Project (LTSP), thin clients (netbooks and barebones desktops), etc. In other words, the return of time-sharing, evolved and using high tech.

Multiseat
http://en.wikipedia.org/wiki/Multiseat_configuration [wikipedia.org]

LTSP
http://en.wikipedia.org/wiki/Linux_Terminal_Server_Project [wikipedia.org]

thin client
http://en.wikipedia.org/wiki/Thin_client [wikipedia.org]

netbook
http://en.wikipedia.org/wiki/Netbook [wikipedia.org]

Computer terminal
http://en.wikipedia.org/wiki/Computer_terminal [wikipedia.org]

etc...
http://tech.yahoo.com/blog/null/24753 [yahoo.com]

A low end computer is enough for 99% of the work. Almost nobody needs or uses a 4-core CPU (except for games), and usually this kind of power is not used in the enterprise.

Incorrect about Moore's law (5, Informative)

rminsk (831757) | more than 4 years ago | (#26642817)

Moore's law does not say "that the amount of computing power available at a particular price doubles every 18 months." Moore's law says that the number of transistors that can be placed inexpensively on an integrated circuit increases exponentially, doubling approximately every two years.
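(The gap between the summary's 18-month figure and the roughly two-year figure compounds quickly; a quick awk sketch over a decade:)

$ awk 'BEGIN { printf "10 years at 18-month doubling: %.0fx\n10 years at 24-month doubling: %.0fx\n", 2^(120/18), 2^(120/24) }'
10 years at 18-month doubling: 102x
10 years at 24-month doubling: 32x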

Re:Incorrect about Moore's law (1)

Tibor the Hun (143056) | more than 4 years ago | (#26643207)

Yes, but you don't expect anyone on a site like this to actually know your random "factoid".

What we would also like to see is how many computing powers such as this equal a standard US LOC?

Windows 7 Will Set Us Free (1)

agrestic (1434457) | more than 4 years ago | (#26642831)

Darien Graham-Smith: "Vista has had us driving with the handbrake on for the past two years, but at long last Windows 7 is coming to set us free." .....I hope this doesn't turn out like the transition from Hoover to FDR.

Xen would have been a better comparison? (0)

Anonymous Coward | more than 4 years ago | (#26642853)

Contrasting XenServer [360is.com] with VMware [vmware.com] would have been a better comparison. Xen have gone down the super-thin hypervisor route, with only a few tens of thousands of lines of code in their core software and the rest plugged in via API. This is in contrast to the bigger, integrated approach taken by the existing market leaders.

AG

Microsoft got told off (1)

w0mprat (1317953) | more than 4 years ago | (#26642941)

There is a possibility (however unlikely) that Microsoft has made a very clever and astute move going for a streamlined version of their current technology rather than a new iteration with whatever arbitrary new features weighing it down. It's clever because it's timed for release in a global recession and a switch in focus to developing markets - the 'next billion' users.

But it's unlikely, and I would hesitate to say Microsoft has actually preempted anything. I'd say they're responding to what they've been told by hardware vendors who have put their foot down and said, "this is what we will be selling; if you don't come up with something, we're going to take matters into our own hands with Linux".

they needed that last part! (1)

ILuvRamen (1026668) | more than 4 years ago | (#26642955)

My 8 year old laptop running ME can boot in 14 seconds and shut down in just under 4. I think XP and Vista were a big step in the wrong direction. When a set of hardware that's overall about 30x faster still takes at least 5x longer to boot XP or Vista, somebody screwed up. It's about time operating systems got thinner and more efficient, regardless of the economy or stalling hardware speeds.

Hmm... (3, Interesting)

kabocox (199019) | more than 4 years ago | (#26642971)

I can buy a $350 mini laptop, a $500 decently specced laptop, or a $500 desktop with what would have been unbelievable specs not long ago. I remember when I picked up Computer Shopper and was thrilled that there were any bare-bones desktops that sold at the $1K mark. Now you can get full featured systems for under $500 that do things that $2-3K machines couldn't do.

Really, there is no such thing as a "Moore's Law." It's Moore's trend lines that have been holding. That it lasted 10 years, much less this long, has been utterly amazing. I fully expect us to run into problems keeping up with "Moore's Law" before 2100. 5-10 years after the trend is broken it'll be something future folks will either forget about entirely or look back on and kinda giggle at, like we were just silly about it all. 50-100 years later no one will care, though everyone will be making use of the by-products of it. Do you notice where the stuff for roads comes from, or which Roman engineer built the most or best roads? That's generally what they'll think of any computing device older than 20 years. If Moore's law holds until 2050, every computing device that we've currently made will be either trash or museum pieces by that time. Heck, you have people getting rid of or upgrading cell phones almost every 3-6 months already.

We imagine replicators in Star Trek, but we don't need them with Walmart and 3-6 months for new products to come out. Consider Amazon + UPS next day shipping. Replicator tech would have to be cheaper and faster than that to compete. I think that it's more likely that we'll keep on improving our current tech. What happens when UPS can do 1 hour delivery to most places on the globe? Replicators might spring up, but only for the designers to use to spend a week making 10K of a unit that goes on sale today, sells out in two weeks, and is discounted the week after. Face it; we are already living in a magical golden age. We just want it to be 1000x better in 50 years.

Re:Hmm... (2, Insightful)

roe-roe (930889) | more than 4 years ago | (#26643351)

While I can appreciate your sentiment, you *can't* get a decent laptop for $500. You can get a laptop that will run XP or GNU/Linux or *BSD for $500. But the world uses Windows, and if you are going to be running Vista well, you are looking at $800 for the laptop. And while that is phenomenal, TFA is trying to convey that over the next few months they want to take the $800 laptop and make it cost $500, and the $500 desktop and make it cost $400. Industries hurting now don't care where we are going to be in 100 years or even how far we have come in 10. The industry has been chasing this ever increasing sliding scale of performance. Consumers have benefited by getting more powerful machines.

Oddly enough, Moore's observations are still viable, but it is the economy that is going to slow the trend. Demand is shifting from the same price point to one lower. This will cause a momentary dip in the trend. Once the new price point stabilizes, Moore's Law will again be relevant.

Eh? (0)

Anonymous Coward | more than 4 years ago | (#26643015)

I couldn't care less about having "the latest (and not necessarily the greatest)". I still have a single-core Sempron 3500+ (Socket 939), and guess what? I don't feel the need to upgrade. It plays the games I like (CSS/TF2), runs the apps I use, all without problems. Plus, when I do finally "upgrade" to a dual-core in a few years, today's games will be cheaper, and cracks'll be available for 'em, so I won't need to deal with that dang-fangled DRM.

Vista deserves credit... (4, Insightful)

xs650 (741277) | more than 4 years ago | (#26643027)

Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version.

Without Vista, MS wouldn't be able to claim that 7 was faster than their previous version of Windows.

Re:Vista deserves credit... (1)

s_p_oneil (795792) | more than 4 years ago | (#26643519)

I couldn't have said it better myself. My favorite early "indicator" of Vista performance was all those Crucial ads on Slashdot that essentially said "Vista is a memory hog, buy more Crucial RAM". Did anyone else here think those were hilarious?

More is More (4, Interesting)

Archangel Michael (180766) | more than 4 years ago | (#26643157)

One of the things I learned many years ago is that computing speed isn't really about how fast something runs. Rather, it is a matter of whether or not you actually run it at all.

If computer speeds double and Task A currently takes you ten seconds, a new computer that does the same task in 5 seconds is not that big of a deal.

However, if you run Task B, which takes 1.5 hours to complete, and a new computer will run that same task in, say, 4 minutes (a real-world example from my past: log processing), what matters isn't just the 86-minute difference, but whether and how often you actually run that task at all.

It is endlessly amusing to see "real world benchmarks" that take about 3 minutes on most processors, with the whole field separated by less than 2x. Or frames per second. Or whatever.

When shit takes too long to get done, you tend NOT to do it. If the difference is a few seconds, that is nice and all, and a few seconds may be of interest to "extreme" hobbyists.

But the real-world differences that matter are not marginal drops from 300 seconds to 279. Sorry, but those extra few seconds aren't going to stop you from running that task.

The true measure is not how fast something gets done, but whether you do the task at all or skip it because the time involved is prohibitive.

Hardware and Software (1)

rijrunner (263757) | more than 4 years ago | (#26643275)

The reality is that hardware has pulled so far ahead of software that it will be years before we exploit our current level of technology to its capacity.

We have apps that don't know how to divide work between CPUs (and some OSes that barely grasp the concept). We have applications that were designed in the era of 16-bit machines and tight memory limits, patched and nudged along when they really need a completely new architecture underneath to function well. A few software companies have pushed the limits in graphics or number crunching, but that hasn't had time to diffuse through the rest of the industry.

Within a few years, we will have architectures with an arbitrary number of processor cores and hardly any apps that can take advantage of them. A lot of what used to live in enterprise-level boxes has worked its way down into desktops and laptops.

How about we get some balance back? We have barely scratched the surface of what the current technology can do.
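A minimal sketch of what "handling an arbitrary number of cores" means in practice, assuming POSIX threads and the widely available (but not strictly standard) sysconf(_SC_NPROCESSORS_ONLN) query; the array size, names, and workload below are illustrative, not anything from the article or the parent post:

    /* Ask the OS how many cores are online and split a dumb summation
       across that many threads. Build with: cc -O2 -pthread par_sum.c */
    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    #define N 10000000

    static double data[N];

    struct chunk { size_t lo, hi; double sum; };

    static void *partial_sum(void *arg)
    {
        struct chunk *c = arg;
        double s = 0.0;
        for (size_t i = c->lo; i < c->hi; i++)
            s += data[i];
        c->sum = s;                      /* each thread reports its own slice */
        return NULL;
    }

    int main(void)
    {
        long ncores = sysconf(_SC_NPROCESSORS_ONLN);
        if (ncores < 1) ncores = 1;      /* fall back to a single thread */

        for (size_t i = 0; i < N; i++)   /* dummy workload */
            data[i] = 1.0;

        pthread_t *tid = malloc(ncores * sizeof *tid);
        struct chunk *ck = malloc(ncores * sizeof *ck);
        size_t per = N / ncores;

        for (long t = 0; t < ncores; t++) {
            ck[t].lo = t * per;
            ck[t].hi = (t == ncores - 1) ? N : (size_t)(t + 1) * per;
            pthread_create(&tid[t], NULL, partial_sum, &ck[t]);
        }

        double total = 0.0;
        for (long t = 0; t < ncores; t++) {
            pthread_join(tid[t], NULL);  /* wait, then merge partial results */
            total += ck[t].sum;
        }

        printf("%ld threads, total = %.0f\n", ncores, total);
        free(tid);
        free(ck);
        return 0;
    }

Even something this trivial has to chunk the work by hand and merge the partial results itself; nothing in the hardware does it for you, which is exactly the gap between what the silicon offers and what most applications use.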
 

And this is new? (1)

nick_davison (217681) | more than 4 years ago | (#26643463)

The first 486 I got my hands on came with a $5,000 price tag.

My first Pentium came in, well spec'd, around $2,500.

The PCs I was building for myself ran about $1,500 five years ago and the last one was down around $1,100 - all "mid-range" machines, capable of playing the latest games in fairly high quality and reasonably functional for at least 18 months to 2 years.

Since a little after that Pentium, the systems I see more casual friends buying have dropped from a few people buying $3,000 laptops, to a fair number getting $2,000 laptops, to most of them having $1,000 laptops, to $699 Circuit City or BestBuy "deals", to $300 netbooks.

When Moore came up with his law, in the mid 60s, what was the single cheapest usefully functional computer you could buy? I wasn't buying then but I'm guessing several hundred thousand dollars and it took up half the room. From day one, people have been trading processor cycles for cost.

Moore's law is already busted (1)

Renegade Iconoclast (1415775) | more than 4 years ago | (#26643475)

Adding transistors doesn't matter; it's the software that matters.

Multi-CPU machines can only be as efficient as the software that drives them, and right now, the interaction between the hardware and software is complex and poorly understood by engineers (or at least, normal programmers like myself). There's a lot of unnecessary experimentation involved in getting something to utilize the hardware efficiently, because it's very much an exercise left to the reader (though I'd be pleased to be proven wrong with a kickass resource).

Many applications use just one of your 8 CPUs. In fact, Moore's Law has already been busted in practice, because computers have not gotten faster; they've just gotten more cores.

I expect a huge paradigm shift in programming to take advantage of now ubiquitous multi-CPU systems, but it hasn't happened yet, not even close.
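To put a rough number on "only as efficient as the software that drives them" (an added aside, not from the parent post), Amdahl's law gives the ceiling: if a fraction p of a program can run in parallel on n cores, the best possible speedup is

    speedup = 1 / ((1 - p) + p / n)

With p = 0.9 on n = 8 cores that is 1 / (0.1 + 0.1125), about 4.7x rather than 8x; and for an application that uses only one of your 8 CPUs (p = 0), the extra cores buy exactly nothing.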

what Microsoft Giveth (1)

Matey-O (518004) | more than 4 years ago | (#26643479)

Flash taketh away.

Rant against Moore's law (0)

Anonymous Coward | more than 4 years ago | (#26643485)

Ok this has always annoyed me so it's time to finally rant about it.

As I see it, this guy Moore, who seems like a fine guy otherwise, has been made utterly FAMOUS for, well, stating the fucking obvious! "Computer speeds will double every year" is about as clever an observation as "human population will increase every year" (with similar limitations). In fact, whenever you talk about population increase, would you start referring to it as "AC's Law"? That would make me happy and have a nice irony to it.

Ok, I suppose it helps that he co-founded Intel or whatever.

Thatisallthankyou.

Actually (1)

rgviza (1303161) | more than 4 years ago | (#26643501)

by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months

-------------
BZZZZ wrong answer
Moore's original observation was that the number of transistors we are able to pack inexpensively into a given amount of silicon real estate doubles every year. He revised that prediction to every two years in 1975, which bolstered its perceived accuracy.
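(An added aside, not part of the original comment: a two-year doubling period compounds to 2^5 = 32x per decade and 2^10 = 1024x, roughly a thousandfold, over twenty years, which is why the exact period people quote, one year, 18 months or two years, changes long-range extrapolations enormously.)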

The number of transistors you get for a particular price is a moving target: it depends on the supply of and demand for the silicon, how much the R&D for the die cost, how far along the process is, how much waste there is, and so on. This is a subtle, yet very important, distinction.

You can pay $1200 for a bleeding edge CPU, or you can wait a few months and pay $400.

While the raw silicon has a relatively stable price, the process and its requirements are a variable cost that is depreciated over the life of the process. This is why finished retail CPUs drop rapidly in price over time.

The R&D and manufacturing cost per unit area actually goes up over time.

-Viz
