
Dual-core Systems Necessary for Business Users?

CowboyNeal posted more than 8 years ago | from the techno-overkill dept.


Lam1969 writes "Hygeia CIO Rod Hamilton doubts that most business users really need dual-core processors: 'Though we are getting a couple to try out, the need to acquire this new technology for legitimate business purposes is grey at best. The lower power consumption which improves battery life is persuasive for regular travelers, but for the average user there seems no need to make the change. In fact, with the steady increase in browser based applications it might even be possible to argue that prevailing technology is excessive.' Alex Scoble disagrees: 'Multiple core systems are a boon for anyone who runs multiple processes simultaneously and/or have a lot of services, background processes and other apps running at once. Are they worth it at $1000? No, but when you have a choice to get a single core CPU at $250 or a slightly slower multi-core CPU for the same price, you are better off getting the multi-core system and that's where we are in the marketplace right now.' An old timer chimes in: 'I can still remember arguing with a sales person that the standard 20 Mg hardrive offered plenty of capacity and the 40 Mg option was only for people too lazy to clean up their systems now and then. The feeling of smug satisfaction lasted perhaps a week.'"


398 comments

You've got more threads than you might think... (5, Insightful)

gbulmash (688770) | more than 8 years ago | (#14984960)

The key quote here, IMO, is: "Multiple core systems are a boon for anyone who runs multiple processes simultaneously and/or have a lot of services, background processes and other apps running at once."

All the anti-virus, anti-spyware, anti-exploit, DRM, IM clients, mail clients, multimedia "helper" apps, browser "helper" apps, little system tray goodies, etc., etc., and so on, it can start to add up. A lot of home and small business users are running a lot more background and simultaneous stuff than they may realize.

That's not to say these noticeably slow down a 3.2GHz single-core machine with a gig of RAM, but the amount of stuff running in the background just keeps growing. Dual core may not be of much benefit to business users now, but how long will that last?

- Greg
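For anyone curious how much is actually running on their own box, here is a minimal sketch of the same check. It assumes a modern Python with the third-party psutil package installed, so treat it as illustrative rather than period-accurate:

    # Rough sketch: count the processes and threads currently running locally.
    # Requires the third-party psutil package (pip install psutil).
    import psutil

    procs = list(psutil.process_iter(['name', 'num_threads']))
    total_threads = sum(p.info['num_threads'] or 0 for p in procs)

    print(f"{len(procs)} processes, {total_threads} threads in total")
    # Show the ten busiest processes by thread count.
    for p in sorted(procs, key=lambda p: p.info['num_threads'] or 0, reverse=True)[:10]:
        print(f"{p.info['num_threads'] or 0:5d} threads  {p.info['name']}")

Even a mostly idle desktop typically reports dozens of processes and several hundred threads, which is the point being made above.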

Spend the extra money on flash-cache (2, Insightful)

dsginter (104154) | more than 8 years ago | (#14985016)

I'd rather spend the extra $750 on flash cache memory for the hard drive. Or just replace the hard drive altogether [eetimes.com]. I guarantee either of these would win the average business Joe's pick in a triple-blind taste test.

Re:Spend the extra money on flash-cache (1)

NoMoreNicksLeft (516230) | more than 8 years ago | (#14985117)

Yes. And at 80,000 writes, the drive is dead. Nice. Your 100gig solid state drive will be a 40gig drive (and fragmented at that) in a year.

Do you own semiconductor stocks or something?

Re:Spend the extra money on flash-cache (1)

Mike Savior (802573) | more than 8 years ago | (#14985191)

You fail to see that the point of CACHE is to CACHE. So a 40GB drive with a 256MB cache is still a 40GB drive. Also, solid state devices are -supposed- to survive "millions" of write cycles, or at least hundreds of thousands, before dying. YMMV.

Re:Spend the extra money on flash-cache (2, Informative)

EnderWiggnz (39214) | more than 8 years ago | (#14985317)

All flash sold today contains wear-levelling technology. The lowest figure I've seen is 400,000 writes MTBF.

Actually, most manufacturers don't even list that anymore, and just give hours of use.

And... flash has been this way for at least 5 years.
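To make the wear-levelling idea concrete, here is a toy sketch (not how any real controller is implemented, just the principle): each logical write is remapped to the least-worn physical block, so repeated writes to one logical block end up spread across the whole device.

    # Toy wear-levelling sketch: remap every logical write onto the least-worn
    # physical block. Real flash controllers are far more sophisticated; this
    # only illustrates why rewriting one block doesn't hammer the same cells.
    class ToyFlash:
        def __init__(self, physical_blocks):
            self.erase_counts = [0] * physical_blocks
            self.mapping = {}                      # logical block -> physical block
            self.free = set(range(physical_blocks))

        def write(self, logical_block, data):
            old = self.mapping.get(logical_block)
            target = min(self.free, key=lambda b: self.erase_counts[b])
            self.free.remove(target)
            self.erase_counts[target] += 1         # each write costs one erase cycle here
            self.mapping[logical_block] = target
            if old is not None:
                self.free.add(old)                 # the stale copy becomes free again

    flash = ToyFlash(physical_blocks=8)
    for _ in range(1000):
        flash.write(0, b"same logical block, over and over")
    print(flash.erase_counts)                      # wear ends up spread across all 8 blocks

Run it and the 1,000 writes to logical block 0 come out as roughly 125 erase cycles per physical block instead of 1,000 on a single one.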

Re:Spend the extra money on flash-cache (1)

mrchaotica (681592) | more than 8 years ago | (#14985336)

(and fragmented at that)
<Morbo>FLASH MEMORY DOES NOT WORK THAT WAY!</Morbo>

Re:You've got more threads than you might think... (5, Interesting)

The Crazed Dingus (961915) | more than 8 years ago | (#14985020)

It's true. I recently took a look at my own system's running processes, and while it only shows four or five icons in the system tray, I found that I have almost 50 background apps running, and almost 1,500 process modules to boot. This is way up from a year ago, and it points to multi-core processors becoming the norm.

Overkill Dragging Customers Along (5, Informative)

Anonymous Coward | more than 8 years ago | (#14985050)

Obviously, few (if any) business users need anything more than a Pentium III running at 500 MHz. That processor is perfectly acceptable for business applications like OpenOffice.

Unfortunately, ultimately, most business users will be forced to upgrade to new systems simply because there will no longer be replacement parts for the old systems.

Consider the case of memory modules. 5 years ago, 64MB PC100 SODIMMs were plentiful. Now, they are virtually extinct. By 2010, you will not be able to find any replacement memory modules for your 1999 desktop PC because it requires PC100 non-DDR SDRAM, and no one will sell the stuff. In 2010, the only thing that you can buy is DDR2 SDRAM, Rambus DRAM, or newer-technology DRAM.

In short, by 2010, you will be forced to upgrade for lack of spare parts.

Re:Overkill Dragging Customers Along (2, Funny)

Andrew Tanenbaum (896883) | more than 8 years ago | (#14985150)

OpenOffice on a P3 500? I feel sorry for you.

I can't even tolerate its glacier-like performance on my dual Xeon system with 8 gigabytes of RAM.

Re:Overkill Dragging Customers Along (2, Informative)

Anonymous Coward | more than 8 years ago | (#14985176)

> few (if any) business users need anything more than a Pentium III running at 500 MHz.

The processor is acceptable, but the hard drives and RAM subsystems typically found in machines of that era are not. The Intel 815 board topped out at 512MB of slow SDRAM, and the 20GB disks found in those machines have horrendous seek times.

Since most companies did not buy multi-thousand-dollar workstations for their desktops back in 1998 or whenever, the fact is that older machines simply can not handle the typical 2006 load-out of office suite/groupware/anti-virus/Firefox/KDE/or what have you.

Re:You've got more threads than you might think... (1)

mnmn (145599) | more than 8 years ago | (#14985144)

I set up an OpenBSD system in VMware and tried to trim it down as much as possible. Init would just exec bash and nothing more. There were still some kernel processes. I realized you can't have a single-process OS today besides DOS.

So besides DOS, anything else can utilize the second core. As far as feasibility goes, if it costs twice as much as a single core, it's not worth it. If it costs 20% more, it's well worth it.

Re:You've got more threads than you might think... (1, Informative)

Anonymous Coward | more than 8 years ago | (#14985169)

All the anti-virus, anti-spyware, anti-exploit, DRM, IM clients, mail clients, multimedia "helper" apps, browser "helper" apps, little system tray goodies, etc., etc., and so on, it can start to add up.

With dual processors, Norton AV is grabbing more than 100% CPU at times, even with their own throttling settings enabled.
Maybe Symantec wants us to turn off scanning compressed files. I won't try, nor will I ever put them on a 4 CPU system.

Re:You've got more threads than you might think... (1)

jamesh (87723) | more than 8 years ago | (#14985231)

But how many of those threads are CPU bound? The moment you start doing any number crunching, then (assuming code written to take advantage of it, and only up to a limit) the more CPUs the merrier, but no amount of extra CPU is going to get that data off the disk faster.

That being said though, if you have enough memory to hold your entire SQL/Mail/whatever database in memory, you might start to see the benefits of multiple cpu cores for read oriented queries.

A cool tool would be one that watches system activity over the course of a day/week/month and figures out what system improvement (CPU/Memory/Disk) is going to benefit you the most, based on the time threads spend waiting for swap, disk data, or CPU time.
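Nothing exactly like that ships with the OS, but a rough approximation is easy to script. A minimal sketch (assuming a modern Python with the third-party psutil package; the thresholds are made up for illustration) that samples CPU, swap, and disk activity and reports which resource looks most saturated:

    # Rough bottleneck-sniffing sketch: sample CPU, swap, and disk activity for a
    # minute, then report which resource was busiest. Thresholds are arbitrary.
    import psutil

    SAMPLES, INTERVAL = 60, 1.0
    cpu, swap, disk_busy = [], [], []
    last_io = psutil.disk_io_counters()

    for _ in range(SAMPLES):
        cpu.append(psutil.cpu_percent(interval=INTERVAL))   # blocks for INTERVAL seconds
        swap.append(psutil.swap_memory().percent)
        io = psutil.disk_io_counters()
        disk_busy.append((io.read_time + io.write_time) -
                         (last_io.read_time + last_io.write_time))  # ms of disk time
        last_io = io

    avg = lambda xs: sum(xs) / len(xs)
    print(f"avg CPU {avg(cpu):.0f}%, avg swap {avg(swap):.0f}%, "
          f"avg disk busy {avg(disk_busy):.0f} ms per sample")

    if avg(cpu) > 80:
        print("Looks CPU-bound: faster or additional cores should help.")
    elif avg(swap) > 20 or avg(disk_busy) > 500:
        print("Looks memory/disk-bound: more RAM or faster disks should help.")
    else:
        print("Nothing is obviously saturated.")

A real version would log over days rather than a minute, but even this crude sampling makes the CPU-versus-I/O distinction visible.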

Re:You've got more threads than you might think... (0)

Anonymous Coward | more than 8 years ago | (#14985249)

All the anti-virus, anti-spyware, anti-exploit, DRM, IM clients, mail clients, multimedia "helper" apps, browser "helper" apps, little system tray goodies, etc., etc., and so on, it can start to add up. A lot of home and small business users are running a lot more background and simultaneous stuff than they may realize.

Don't forget the all important print spooler! heh

Yes, multi-core is the way to go. There is no such thing as only having one program in memory any more; those days died a looong time ago with DOS and CP/M. Besides, it's the only option if we want faster CPUs anyway, because chip fab technology has hit a clock speed wall of sorts. So now parallelization (is that a word?) has to be the main focus. If you ask me we should have started moving in this direction a long time ago, but at least now your everyday Windows programmer (most Unix guys already get this, we have for DECADES) will finally be forced to consider threading and realize they are in a multi-tasking environment!! Use your FORK and stop playing with your food! You're getting sloppy code all over the table...

However, I will agree that the high-end processors in the consumer market are overkill for most current business apps. The key word here is "current"; some day we will need that power to take advantage of the latest and greatest in business apps. Or, if current Microsoft bloat-code standards remain in effect, we will need that power in 2007 just to boot Vista and run Outlook to check email! heh

Yes, very (4, Funny)

Anonymous Coward | more than 8 years ago | (#14984965)

Also, that 30 inch monitor is very important.

Re:Yes, very (3, Informative)

thrillseeker (518224) | more than 8 years ago | (#14985008)

Also, that 30 inch monitor is very important.

I'm holding out for one of these [hankooki.com] .

Re:Yes, very (4, Funny)

wideBlueSkies (618979) | more than 8 years ago | (#14985022)

Of course, being the typical /. geek, you're referring to the huge screen, not the girls, right? :)

wbs.

Re:Yes, very (0)

Anonymous Coward | more than 8 years ago | (#14985088)

Honestly, in that picture the screen is more interesting - except perhaps for sex-starved desperados. The girls might have been more interesting in ways the picture can't communicate, but that's irrelevant to the picture :-)

Re:Yes, very (2, Funny)

wideBlueSkies (618979) | more than 8 years ago | (#14985109)

What that needs to be is an LCD/HDTV enabled waterbed..... It's certainly big enough.... Imagine being able to watch pron, in the bed, with the models?

God, I need to get some soon.....

wbs.

Re:Yes, very (1)

gstoddart (321705) | more than 8 years ago | (#14985194)

Of course, being the typical /. geek, you're referring to the huge screen, not the girls, right? :)

*phbhbhbhtt* Screw that. I'll keep squinting at my dual 19" monitors. I'm holding out for the girls.

Of course, I stand a better chance at the 100" LCD, but I can still hope.

Re:Yes, very (1)

AndreiK (908718) | more than 8 years ago | (#14985179)

Yes, I so want that.

Oh, and the monitor is cool too.

Think Big (1)

mfh (56) | more than 8 years ago | (#14985036)

Dual 30" monitors. KVM'd to top systems running Quad-SLI Geforce 7900's [slashdot.org] . dr0000l...

Wait a minute. Why stop at 30"? Plasmas [hitachi.us] !

Re:Yes, very (1)

DextroShadow (957200) | more than 8 years ago | (#14985160)

My porn looks best in high res.

Storage size (1, Funny)

EdMcMan (70171) | more than 8 years ago | (#14984972)

'I can still remember arguing with a sales person that the standard 20 Mg hardrive offered plenty of capacity and the 40 Mg option was only for people too lazy to clean up their systems now and then. The feeling of smug satisfaction lasted perhaps a week.'

If you build it, they will fill it.

Re:Storage size (0)

Anonymous Coward | more than 8 years ago | (#14985281)

If you build it, they will fill it

...with pr0n

Storage size-One tire. (0)

Anonymous Coward | more than 8 years ago | (#14985305)

"If you build it, they will fill it."

The pothole in front of my house disagrees with you.

I don't agree either. (3, Insightful)

Saven Marek (739395) | more than 8 years ago | (#14984980)

for the average user there seems no need to make the change. In fact, with the steady increase in browser based applications it might even be possible to argue that prevailing technology is excessive.'

I definitely don't agree. I remember hearing the same rubbish comments in various forms from shortsighted journos and analysts when we were approaching CPUs at 50MHz. Then I heard the same thing creeping up to 100MHz, then 500MHz, then 1GHz.

It is always the same. "The average user doesn't need to go up to the next $CURRENT_GREAT_CPU because they're able to do their average things OK now." Of course they're able to do their average things now; that's why they're stuck doing average things.

Re:I don't agree either. (1)

Eightyford (893696) | more than 8 years ago | (#14985038)

I definitely don't agree. I remember hearing the same rubbish comments in various forms from shortsighted journos and analysts when we were approaching CPUs at 50MHz. Then I heard the same thing creeping up to 100MHz, then 500MHz, then 1GHz. It is always the same. "The average user doesn't need to go up to the next $CURRENT_GREAT_CPU because they're able to do their average things OK now." Of course they're able to do their average things now; that's why they're stuck doing average things.

The only reason business users need fast CPUs is because many programmers these days are stupid and lazy.

Re:I don't agree either. (2, Insightful)

Poltras (680608) | more than 8 years ago | (#14985093)

What if it is true? My mom does not need to play Doom III. My cousin does not need to load 500 things in the background (such as QoS and scheduler, great services of course...). My grandfather just wants to play cards with friends over the internet, while his wife wants to print recipes. Those are average things, and they ask a computer to do them. I don't want them to be blasted off by a great Aero Glass window border when they can put that saved money elsewhere (notably in banks, so that I can have it when they die... muhahaha :P). Why would they upgrade?

The same applies elsewhere... I bought my car (a Yaris) for the gas savings (because the price in Quebec is waaaay too high), not for speed. I don't need speed; I just go to work in the day and come back at night, with a bit of camping on weekends and the usual downtown parties. Tell me, why would I buy the latest Ferrari when I can put my saved money into something else, such as buying a new computer (I'm a geek and I play games and reverse hashes... oh wait)? Would you?

Re:I don't agree either. (1)

discstickers (547062) | more than 8 years ago | (#14985263)

Because a Ferrari is FUN.

Re:I don't agree either. (2, Insightful)

JanneM (7445) | more than 8 years ago | (#14985097)

I'd like to rephrase it as "The average user does not need the $CUTTING_EDGE_STUFF because the $CURRENT_CHEAP_LOWER_END will run all they want to do just fine for the next few years."

In, say, three years, when dual core systems are slowly entering the low end, it makes sense for business users (and, frankly, the vast majority of users in general) to get it. Right now, dual core is high end stuff, with the price premium to prove it. Let the enthusiasts burn their cash on it, but for businesses, just wait another generation.

You're not leasing sports cars for your salesforce, you're not getting Mont Blanc pens for your office workers, why should you pay a premium on electronics that doesn't do anything for productivity either?

Re:I don't agree either. (1)

mqtthiqs (963238) | more than 8 years ago | (#14985119)

How have the needs of business users changed over the last 10 years? What function of Word that wasn't available in Word 6.0 requires this insane increase in performance? One more question: what share of the resources goes to the effective code (the Word window) compared with the code that checks the effective code (antivirus, anti-spyware...)? These average business users don't need more than they already had in 1995!

It's a flocking behaviour... (3, Insightful)

tlambert (566799) | more than 8 years ago | (#14985301)

It's a flocking behaviour... and you *must* take it into account when choosing software.

Q: "What function of Word that wasnt available in Word 6.0 and is now requires this insane increase of performance need?"

A: The ability to open and read documents sent to you by third parties using the newer tools.

For example, when your lawyer buys a new computer, and installs a new version of Office, and writes up a contract for you, you are not going to be able to read it using your machine running an older version of the application. And the newer version doesn't run on the older platform.

Don't worry - the first copy of a program that has this continuous upgrade path lock-in is free with the machine.

-- Terry

Re:I don't agree either. (1, Funny)

1u3hr (530656) | more than 8 years ago | (#14985216)

It is always the same. "The average user doesn't need to go up to the next $CURRENT_GREAT_CPU because they're able to do their average things OK now". Of course they're able to do their average things now, that's why they're stuck doing average things.

As opposed to what non-average things?

I upgraded from an 850 MHz Centris to a 2.4 GHz Athlon a few months ago when the old mobo died; I don't see any noticeable difference in performance except video, which is a different matter. And I do DTP, which is more demanding than the average office paperwork. As for gaming, Freecell seems about the same too.

The old timer's right - it's a stupid argument (2, Insightful)

strider44 (650833) | more than 8 years ago | (#14984990)

It's inevitable. The more resources we have, the more we're going to want to use. That goes for basically everything - it's just human nature.

resource usage (1, Insightful)

Anonymous Coward | more than 8 years ago | (#14985080)

Another quote relates to spending.

Ones "necessary expenses" always grow to meet ones income.

Re:The old timer's right - it's a stupid argument (1)

bstadil (7110) | more than 8 years ago | (#14985293)

Good point

Think about it next time someone argues for the need to increase the military budget. Keeping America strong actually makes war more likely, as we have seen, and doesn't really make anyone safer, including ourselves.

Not really (2, Informative)

Theatetus (521747) | more than 8 years ago | (#14984995)

Multiple core systems are a boon for anyone who runs multiple processes simultaneously and/or have a lot of services

Not really. It all depends on your scheduler. There's just no telling without testing if a given application / OS combination will do better or worse on dual-core.

Remember, two active applications, or two threads in an active application, does not mean those two processes or threads get to be piped to separate cores or processors. That might possibly happen but it probably won't.

I had a boss who loved to get dual-CPU systems. Why? "Because that way one CPU can run the web server and one CPU can run the database." No matter how often I tried to shake that view from his head it never left. (In point of fact, both were context switching in and out of both CPUs pretty regularly).

In short: dual core, like most parallelized technologies, doesn't do nearly as much as you think it does, and won't until our compilers and schedulers get much better than they are now.

Re:Not really (1)

Joe123456 (846782) | more than 8 years ago | (#14985045)

You should have said that two CPUs are better, and you can even get two or more dual-core AMD CPUs.

Re:Not really (2, Informative)

markov_chain (202465) | more than 8 years ago | (#14985061)

I had a boss who loved to get dual-CPU systems. Why? "Because that way one CPU can run the web server and one CPU can run the database." No matter how often I tried to shake that view from his head it never left. (In point of fact, both were context switching in and out of both CPUs pretty regularly).

Those are not exactly CPU-hungry applications that could take advantage of multiple CPUs. No scheduler in the world will help run a webserver and database better on that machine if the I/O subsystem is the bottleneck.

Re:Not really (2, Informative)

iamacat (583406) | more than 8 years ago | (#14985083)

Last I remember, you could assign processor affinity in the Windows Task Manager to really run the database on one CPU and the web server on another, if that's what gives you the best performance. Of course the main point is that CPU-intensive code from both processes (say, sorting in the database and image rendering in the web server) can run simultaneously. What exactly is wrong with your boss's point of view?
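The same pinning can also be done programmatically rather than through Task Manager. A minimal sketch, assuming a modern Python with the third-party psutil package; the process names are hypothetical, and cpu_affinity isn't available on every platform:

    # Sketch: pin two hypothetical processes to different cores via CPU affinity.
    # Requires the third-party psutil package; process names are illustrative.
    import psutil

    def pin(process_name, cpus):
        for p in psutil.process_iter(['name']):
            if p.info['name'] == process_name:
                p.cpu_affinity(cpus)   # restrict this process to the listed cores
                print(f"{process_name} (pid {p.pid}) -> CPUs {cpus}")

    pin('httpd.exe', [0])        # web server on core 0
    pin('sqlservr.exe', [1])     # database on core 1

Whether that actually beats letting the scheduler float both processes across both cores is exactly the question being argued in this thread; measure before committing to it.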

Re:Not really (1)

Matthew Weigel (888) | more than 8 years ago | (#14985229)

What's wrong is that the database and web server are probably not contending for CPU time anyway. They are both contending for disk and memory access.

Re:Not really (1)

iamacat (583406) | more than 8 years ago | (#14985284)

You actually store your database and your webpages on the same hard drive?

Re:Not really (1)

Matthew Weigel (888) | more than 8 years ago | (#14985341)

That doesn't matter.

They are contending for PCI bus bandwidth, disk controller bandwidth, and (like I said before) memory bandwidth. Either your needs are lightweight enough that storing your database and your web pages on the same disk is basically fine, or your needs are heavyweight enough that you'll get better performance for less by separating out the systems further.

Re:Not really (1)

shawnce (146129) | more than 8 years ago | (#14985096)

Remember, two active applications, or two threads in an active application, does not mean those two processes or threads get to be piped to separate cores or processors.

...but it does mean that those two "active" applications (with X number of threads) can have two threads execute concurrently, one on each core (assuming little resource contention elsewhere).

(In point of fact, both were context switching in and out of both CPUs pretty regularly.)

So? If they both needed CPU resources at the same time, they could likely get them at the same time, since the system had dual processors.

Re:Not really (2, Insightful)

jevvim (826181) | more than 8 years ago | (#14985131)

In point of fact, both were context switching in and out of both CPUs pretty regularly.

But it didn't have to be that way; most multiprocessor operating systems will allow you to bind processes to a specific set of processors. In fact, some mixed workloads (although, admittedly, rare) show significant improvement when you optimize in this way. I've even seen optimized systems where one CPU is left unused by applications - generally in older multiprocessor architectures where one CPU was responsible for servicing all the hardware interrupts in the system.

dual core, like most parallelized technologies, doesn't do nearly as much as you think it does, and won't until our compilers and schedulers get much better than they are now.

Compilers are being held back by the programming languages chosen by developers. As hardware concurrency increases, the technology behind compilers for imperative and procedural languages (C, Pascal, Fortran, Java) shows just how ill-suited it is to take advantage of that power. Instead, we will need to move to new languages that will enable compilers to optimize for concurrency, much as circuit designers moved from algebraic logic languages (ABEL, PALASM) to concurrent logic languages (VHDL, Verilog) with the transition from programmable logic devices to field programmable gate arrays.

Re:Not really (3, Interesting)

shawnce (146129) | more than 8 years ago | (#14985157)

In short: dual core, like most parallelized technologies, doesn't do nearly as much as you think it does, and won't until our compilers and schedulers get much better than they are now.

Yeah, just like the color correction of images done by ColorSync (on by default in Quartz) on Mac OS X doesn't split the task into N-1 threads (when N > 1, N being the number of cores). On my quad core system I see the time to color correct the images I display take less than 1/3 the time it does when I disable all but one of the cores. Similar things happen in Core Image, Core Audio, Core Video, etc. ...and much of this is vectorized code to begin with (aka already darn fast for what it does).

If you use Apple's Shark tool to do a system trace you can see this stuff taking place and the advantages it has... especially given that I as a developer didn't have to do a thing other than use the provided frameworks to reap the benefits.

Don't discount how helpful multiple cores can be now, with current operating systems, compilers, schedulers and applications. A lot of the tasks that folks do today (encoding/decoding audio, video and images, encryption, compression, etc.) deal with stream processing, and that can often benefit from splitting the load into multiple threads if multiple cores (physical or otherwise) are available.
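For a feel of what that framework-level split looks like, here is a minimal sketch using a worker pool; color_correct is a stand-in per-tile transform, not the actual ColorSync/Quartz code:

    # Sketch: fan an embarrassingly parallel per-tile job out across the cores,
    # leaving one core free, as described above. color_correct() is a stand-in
    # for the real per-tile transform.
    from multiprocessing import Pool, cpu_count

    def color_correct(tile):
        # placeholder transform: brighten every byte a little
        return bytes(min(255, b + 10) for b in tile)

    if __name__ == "__main__":
        tiles = [bytes(range(256)) * 64 for _ in range(512)]   # fake image tiles
        workers = max(1, cpu_count() - 1)                      # the "N-1 threads" idea
        with Pool(processes=workers) as pool:
            corrected = pool.map(color_correct, tiles)
        print(f"processed {len(corrected)} tiles on {workers} worker processes")

On a machine with more than one core, the wall-clock time drops roughly in proportion to the worker count, minus the overhead of shipping tiles between processes; that is the same effect the poster reports for ColorSync.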

Re:Not really (1)

Trogre (513942) | more than 8 years ago | (#14985193)

What are you talking about? That task bouncing problem you mentioned was fixed in the 2.6 kernel and wasn't really a major problem in 2.4 kernels.

If, though it's not likely, your boss's web server and DBMS were CPU-bound, then without a doubt he'd see better performance on two cores with any modern scheduler worth its bits.

And yes, they would be running on one core each.

Filled to capacity (1)

ozTravman (898206) | more than 8 years ago | (#14984998)

What about when 56k modems were fast enough for everyone. The capacity of applications will always grow to meet and exceed the capacity available to it.

56K? (1, Insightful)

khasim (1285) | more than 8 years ago | (#14985141)

What about when 56k modems were fast enough for everyone.

56K? Son, most people won't read fast enough to keep up with 1,200 baud.

The capacity of applications will always grow to meet and exceed the capacity available to it.

I think you're onto something there.

But I don't think it applies to the single/dual core issue.

I don't think any of the bottlenecks right now are processor related. Most of the issues I see are bandwidth to the box and graphics.

Which would you prefer:
#1. A second proc at the same speed as your current proc?

#2. A second pipe (LAN or Internet) at the same speed as your current pipe?

Assuming that the machine/OS/apps can fully utilize either option.

There are very few systems I've ever seen that ever hit a processor bottleneck ... that are not BROKEN at the time. Endless loops don't count.

I'm all in favour of the development of inexpensive, multi-core procs. Even for the desktop. Even for them becoming the standard on the desktop. Because I don't know what cool new functionality will be available tomorrow.

But from what I see right now, the limitation is how fast I can get data to the single proc I'm running today.

2x the processor power
or
2x the pipe?

Re:Filled to capacity (1)

jer2eydevil88 (960866) | more than 8 years ago | (#14985151)

56k was never fast enough... it was just affordable enough!

ISDN and DSL were just out of reach for the average Joe, but people still wanted them.

Re:Filled to capacity (1)

mooingyak (720677) | more than 8 years ago | (#14985325)

What about when 56k modems were fast enough for everyone.

That would have been when exactly?

Anyone else notice the difference? (0)

Anonymous Coward | more than 8 years ago | (#14985000)

Ten years ago, people were constantly going "in five years, computers will be so fast that they'll never be able to get any faster!".

Cut to ten years later. Computers are still getting faster all the time. But meanwhile, people are going "for the last five years, computers have really been so fast that we don't really need them to get any faster!".

Well, don't worry, I'm sure Microsoft will come to the rescue. You may not think you need dual core systems now, but believe me, once you find yourself for whatever reason running Vista, you'll suddenly find that however much horsepower you have isn't enough.

40 Mg? (4, Funny)

EXMSFT (935404) | more than 8 years ago | (#14985003)

Even my oldest hard drives weighed more than that.

He may be an old timer - but I would think even the oldest old timer knows that MB = Megabyte...

Re:40 Mg? (3, Funny)

hobbesmaster (592205) | more than 8 years ago | (#14985075)

I doubt they weighed more than 40,000 kilograms. :)

Re:40 Mg? (3, Informative)

MyLongNickName (822545) | more than 8 years ago | (#14985197)

As the above poster notes, Mg is 1,000,000 grams, not 1/1000 of a gram.

Re:40 Mg? (0)

Anonymous Coward | more than 8 years ago | (#14985218)

40 megagrams = 40,000 kilograms. Pretty heavy (on Earth.)

Re:40 Mg? (0)

Anonymous Coward | more than 8 years ago | (#14985290)

I was wondering where you could get hard drives made of Magnesium.

Re:40 Mg? (1)

LadyLucky (546115) | more than 8 years ago | (#14985298)

40 Mg is 40 megagrams, which is 40 metric tons.

VISTA (1)

Joe123456 (846782) | more than 8 years ago | (#14985004)

They will need it when they are forced to use vista.

Of course it's not necessary (4, Informative)

Sycraft-fu (314770) | more than 8 years ago | (#14985023)

In general, for office productivity type stuff, processor speed isn't much of a problem. We find that older CPUs like 1.5GHz P4s are still nice and responsive when loaded with plenty of RAM, and we still use them. Office use (like Word, Excel, e-mail, etc.) is a poor benchmark by which to decide how useful a given level of power is, since it usually lags way behind other things in what it needs. I mean, an office system also works fine with an integrated Intel video card, but I can think of plenty of things, and not just games, that benefit from or mandate a better one.

Dual cores are useful in business now for some things; a big one I want one for is virtual machines. I maintain the images for all our different kinds of systems as VMs on my computer. Right now, it's really only practical to work on one at a time. If I have one ghosting, that takes up 100% CPU. Loading another is sluggish and just makes the ghost take longer. If I had a second core, I could work on a second one while the first sat ghosting. It also precludes me from doing much that's intensive on my host system; again, that just slows the VM down and makes the job take longer.

Re:Of course it's not necessary (0)

Anonymous Coward | more than 8 years ago | (#14985185)

Integrated Intel video cards eat up system RAM. Why can't they make them with their own RAM, and just be able to take some of your system RAM if needed?

Re:Of course it's not necessary (1)

GWBasic (900357) | more than 8 years ago | (#14985195)

I find that minor bumps in processor speed don't make Office, web, and email much faster. Dual core is desirable for the situations where one task maxes out a CPU for a short period of time, which does happen in Office, web, and email.

On my personal system, a busy process brings Outlook to its knees. If I had a dual-core system, I wouldn't have a problem.

Re:Of course it's not necessary (1)

NutscrapeSucks (446616) | more than 8 years ago | (#14985207)

I mean, an office system also works fine with an integrated Intel video card, but I can think of plenty of things, and not just games, that benefit from or mandate a better one.

I'm curious what. Intel video works fine for most sorts of 2D graphics or video applications (Photoshop, etc.), and for professional 3D you want a professional card. I guess what I'm getting at is that there's very little need for a consumer Nvidia/ATI card in a business system other than for games.

Re:Of course it's not necessary (0)

Anonymous Coward | more than 8 years ago | (#14985282)

I have one machine at work that I use for only two tasks: compressing audio, graphics, and video for building PowerPoint presentations (average 25MB .PPT file), and refreshing large Excel workbooks with multiple worksheets, multiple large queries in each sheet, and tons of formulas (average 40MB .XLS file). Yes, this is extreme usage of MS Office.

I used to have a "spare" 1.0GHz P3 IBM Netvista for this box, and if I started a presentation and a spreadsheet at the same time, I could expect to wait an hour. My personal laptop is a 2.0GHz P4 Thinkpad A31, and can do it in 16 minutes. I scrounged up a 2.0GHz Xeon HT HP x4000 and it could knock the same task out in 7 minutes. But the tops is the dual Xeon 1.8GHz motherboard I scavenged from a retired server that sits out on my test bench and refreshes the same two concurrent files in about 2 minutes.

Moral:
Dual Xeon > Xeon HT > P4M > P3

Re:Of course it's not necessary (1)

Bios_Hakr (68586) | more than 8 years ago | (#14985300)

Another concern is that a company does not buy a computer for a year. They buy something for 3 years. While I'd like to say we could save money by picking up some $500 whitebox computers, I can't. We'd be buying them every year. As it stands now, we buy the top-of-the-line Dell every 3 years. We may pay $5000 per box, but at least we get something that will still be usable in 3 to 5 years. Not to mention 24/7 support.

On top of all that, company software changes regularly. We may go through a few iterations of Office. Maybe one or two versions of Visio. May even get several versions of AutoCAD in the lifetime of one of our desktops.

Let's say I'm doing a refresh today. Should I buy a middle-of-the-road PC and save money? What if, in a year, we upgrade AutoCAD? And, just for the sake of making my argument better, let's say AC2007 requires Vista. Now, by saving $1000 on a lower-end PC, I've put myself out of the Vista upgrade path. Now I need to go back and get more RAM and a better video card.

I think, for most enterprise applications, going with top-of-the-line is the only option.

Oh, one other way to look at it:

A mid-range PC is $1500.

Let's buy 1000 of those. We have spent $1.5M.

If you add another $1000 for a kick-ass PC, you still only spend $2.5M.

That's chump change to an enterprise. Especially because they'll spread that over a 3-year contract.

You're absolutely right (4, Funny)

NitsujTPU (19263) | more than 8 years ago | (#14985027)

My goodness. I often wonder why people want nice new computer hardware at all. I, personally, am happy with my 8080. People who want new, fast computers are such idiots. Look who's laughing now. My computer only cost me $10, and I can do everything that I want on it.

Re:You're absolutely right (0)

Anonymous Coward | more than 8 years ago | (#14985124)

Yes, but people want to do the things released today, today, not three years from today.

Re:You're absolutely right (1)

rrohbeck (944847) | more than 8 years ago | (#14985246)

I, personally, am happy with my 8080.

And you're safe from viruses and worms under CP/M!

WordStar is probably more stable than Word too...

Re:You're absolutely right (1)

kadathseeker (937789) | more than 8 years ago | (#14985272)

Like not contain all of your movies and favorite TV shows, play Civ4 or Oblivion, run 40 tabs in Firefox, play all your music, run folding@home, run Gaim with 5 chat tabs open, play DVDs, run Photoshop, and be downloading pr0n all at the same time? I realize that not everyone works on their e-penis like that, but what do you DO on a $10 PC? IRC, Nethack, ASCII pr0n and...? If it works for you, then yes, you are smarter than those nitwits that buy Alienwares for Solitaire, but most geeks I know, even non-gamers, have a pretty decent machine to tinker with in all sorts of ways that does all sorts of crazy shit, even if they also have a cheapo *nix box as a server, HTPC, backup, work, or whatever.

I, for one, live in a digital Germany and NEED my Ferrari (okay, more like a riced-up Firebird or maybe a Mustang) for the virtual Autobahn on which I metaphorically drive.

Now I can run my spyware ... (5, Funny)

Hyram Graff (962405) | more than 8 years ago | (#14985031)

Multiple core systems are a boon for anyone who runs multiple processes simultaneously and/or have a lot of services, background processes and other apps running at once.

In other words, it sounds like it's perfect for all those people who wanted to get another processor to run their spyware on but couldn't afford the extra CPU before now.

1996 Called (4, Insightful)

wideBlueSkies (618979) | more than 8 years ago | (#14985047)

It wants to know why we need Pentiums on the desktop. Why isn't a 486 DX fast enough?

wbs.

necessary or not (2, Funny)

MECC (8478) | more than 8 years ago | (#14985054)

They'll want them. Perhaps 'necessary' is not as relevant as 'desired'. Or 'Halo'.

nope (2, Insightful)

pintpusher (854001) | more than 8 years ago | (#14985060)

[out-of-context quote] prevailing technology is excessive.[/out-of-context quote]

I think it's been said for years that the vast majority of users need technology at around the 1995 level or so, and that's it. Unless of course you're into eye-candy [slashdot.org] or need to keep all your spyware up and running in tip-top condition. Seriously though, you know it's true that the bulk of business use is typing letters, contracts, whatever; a little email; a little browsing; and a handful of spreadsheets. That was mature tech 10 years ago.

I run Debian on an Athlon 1700 with 256 megs and it's super snappy. Of course, I use wmii and live by K.I.S.S. Do I need dual-core multi-thread hyper-quad perplexinators? Nope.

I know. I'm a luddite.

Most folks DON'T need much HDD space... (4, Insightful)

Loopy (41728) | more than 8 years ago | (#14985062)

Really, consider the average business PC user. Outside of folks that have large development environments, do video/graphics/audio work, or work on large software projects (such as games), most users really do not need 80GB hard disks. If you DO need more than that, you are probably quickly getting to the point of being able to justify storing your data on a file server. My unit at work only has 30GB on it, and that includes several ghost images of the systems I'm running QA on. Sure, grouse about Microsoft code bloat all you want, but it doesn't take up THAT much HDD space.

Sweeping generalizations are rarely more than "Yeah, me too!" posts. /rolleyes

Re:Most folks DON'T need much HDD space... (3, Insightful)

MyLongNickName (822545) | more than 8 years ago | (#14985180)

Agreed. In fact, in any kind of multi-person office (or single for that matter), only PC software should be on the hard drive. No files. Anything of any importance should be saved to a network.

Whenever work has to be done on one of the office PCs, we do not give you the opportunity to transfer stuff off before we move it out. Lost a file? Go ahead, complain... you'll get written up for violating corporate policy.

Personal files? While discouraged, each user gets so much private space on the network.

Re:Most folks DON'T need much HDD space... (3, Insightful)

Anonymous Brave Guy (457657) | more than 8 years ago | (#14985343)

Agreed. In fact, in any kind of multi-person office (or single for that matter), only PC software should be on the hard drive. No files. Anything of any importance should be saved to a network.

That's nice. I've got about 2GB of automated tests I need to run before I make each release of new code/tests I write to source control. Running these from a local hard drive takes about 2 hours. Running them across the network takes about 10 hours, if one person is doing it at once. There are about 20 developers sharing the main development server that hosts source control etc. in my office. Tell me again how having files locally is wrong, and we should run everything over the network?

(Before you cite the reliability argument, you should know that our super-duper mega-redundant top-notch Dell server fell over last week, losing not one but two drives in the RAID array at once, and thus removing the hot-swapping recovery option and requiring the server to be taken down while the disk images were rebuilt. A third drive then failed during that, resulting in the total loss of the entire RAID array, and the need to replace the lot and restore everything from back-ups. Total down-time was about two days for the entire development group. (In case you're curious, they also upgraded some firmware in the RAID controller to fix some known issues that may have been responsible for part of this chaos. No, we don't believe three HDs all randomly failed within two days of each other, either.)

Fortunately, we were all working from local data, so most of us effectively had our own back-ups. However, this didn't much help since everything is tied to the Windows domain, so all the services we normally use for things like tracking bugs and source control were out anyway. We did actually lose data, since there hadn't been a successful back-up of the server the previous night due to the failures, so in effect we really lost three days of work time.

All in all, I think your "store everything on the network, or else" policy stinks of BOFHness, and your generalisation is wholly unfounded. But you carry on enforcing your corporate policy like the sysadmin overlord you apparently are, as long as you're happy for all your users to hold you accountable for it if it falls apart when another policy would have been more appropriate.

Re:Most folks DON'T need much HDD space... (1)

AndreiK (908718) | more than 8 years ago | (#14985215)

Business users? Yes, if they use more than 30GB on their computers, they are (probably) doing something seriously wrong.

On my home computer, however, I have over 500GB storage, and all but 4GB of it is full.

Re:Most folks DON'T need much HDD space... (1)

MaXiMiUS (923393) | more than 8 years ago | (#14985330)

But.. but.. then where would all my ROMs go to mommy!? .. mommy?

"...getting a couple [for the executives]..." (4, Interesting)

tlambert (566799) | more than 8 years ago | (#14985073)

"...getting a couple [for the executives]..."

I can't tell you how many times I've seen engineers puttering along on inadequate hardware because the executives had the shiny, fast new boxes that did nothing more on a daily basis than run Outlook.

Just as McKusick's Law applies to storage - "The steady state of disks is full" - there's another law that applies to CPU cycles, which is "There are always fewer CPU cycles than you need for what you are trying to do".

Consider that almost all of the office/utility software you are going to be running in a couple of years is being written by engineers in Redmond with monster machines with massive amounts of RAM and 10,000 RPM disks so that they can iteratively compile their code quickly, and you can bet your last penny that the resulting code will run sluggishly at best on the middle-tier hardware of today.

I've often argued that engineers should have to use a central, fast compilation software, but run on hardware from a generation behind, to force them to write code that will work adequately on the machines the customers will have.

Yeah, I'm an engineer, and that applies to me, too... I've even put my money where my mouth was on projects I've worked on, and they've been the better for it.

-- Terry

Obligatory Dilbert Reference (3, Funny)

mad.frog (525085) | more than 8 years ago | (#14985081)

Wally: When I started programming, we didn't have any of these sissy "icons" and "windows". All we had were zeros and ones -- and sometimes we didn't even have ones. I wrote an entire database program using only zeros.

Dilbert: You had zeros? We had to use the letter "O".

Dual core at work? (1)

Mr.Ziggy (536666) | more than 8 years ago | (#14985089)

Unfortunately, the current dual core CPU options do not fit with 'typical' office workers' needs/budgets.

Intel/AMD are pushing high speed, high cost CPUs and hoping someone will buy them for the office.

As others have said, all the background widget apps (anti-spam, spyware, virus, IM) could benefit from dual core and improve the user experience. BUT, they don't need the dual cores to:
-Run hot
-Be Expensive
-Run at High Speeds
-Raise the power bills

Personally, I've deployed and used some Athlon64 (single and dual) systems. I've liked the power management features and would like to see Sempron/Celeron versions of these chips a year from now to deploy to the average user.

As an aside: Do you think the widespread adoption of dual core systems will help grid computing or interesting massive P2P type projects gain acceptance?

Re:Dual core at work? (1)

bigtrike (904535) | more than 8 years ago | (#14985153)

I'm not sure if this is true with AMD processors, but Intel chips actually seem to use less power at higher speeds while mostly idle. If I clock my 3.2GHz home system down to 300MHz, the power use increases by 10 or 20 watts. Supposedly this is due to the processors being really good at shutting down portions of the chip which are not in use.

Really simple math (2, Insightful)

zappepcs (820751) | more than 8 years ago | (#14985120)

The value of having faster hardware is simpler than all this cogitation would lead us to believe. If you spend 12 seconds of every minute waiting on something, that is 20% of your day. Decreasing that wait to 2 seconds greatly reduces waste: wasted man-hours, wasted resources, wasted power....

It might seem trivial, but even with web based services that are hosted in-house, that 12 seconds of waiting is a LOT of time. Right now, if I could get work to simply upgrade me to more than 256MB of RAM, I could reduce my waiting. If I were to get a fully upgraded machine, all the better... waiting not only sucks, it sucks efficiency right out of the company.

As someone mentioned, doing average things on average hardware is not exactly good for the business. People should be free to do extraordinary things on not-so-average systems.

Each system and application has a sweet spot, so no single hardware answer is correct, but anything that stops or shortens the waiting is a GOOD thing...

We all remember that misquote, "640K is enough for anybody", and yeah, that didn't work out so well. Upgrades are not a question of if, but of when... upgrade when the money is right, and upgrade so that you won't have to upgrade again so quickly. Anyone in business should be thinking about what it will take to run the next version of Windows when it gets here... That is not an 'average' load on a PC.

Re:Really simple math (1)

Garabito (720521) | more than 8 years ago | (#14985304)

The point of TFA was not against faster hardware in general; it was about dual core CPUs in particular. Chances are that those 12 sec/min you spend waiting are due to lack of RAM, a slow hard drive, or the memory or system bus; usually the processor is the most wasted resource in a typical desktop PC, because most of the time it sits idle, waiting on I/O-bound programs. A faster CPU, leaving everything else equal, won't improve performance that much, and neither would a dual core CPU.

why don't you... (0)

Anonymous Coward | more than 8 years ago | (#14985316)

...treat yourself to a gig of RAM for your work machine then? I mean, it's cheap!

I'm a blue collar worker. The boss has certain major tools, but I also buy my own, a lot of them, plus replacement parts and whatnot, and it's certainly more per month than what a stick of RAM costs today. A month, not a year, a month. And that's on my low-bucks budget, not a white collar IT office worker's budget. You've got the money, so spend a few dollars. I mean, geez loweez... If it would make you work better, and not make you waste time waiting for the computer to do something, just *do it* and be done with it, and don't wait for iceberg management to poop out a few bucks for you. It's better than going nuts (you posted about it, so it must be bugging you bad), so go ahead and get the tools you need!

Mr. Peabody step into the wayback machine. (1)

Stumbles (602007) | more than 8 years ago | (#14985125)

Oh please. TFA is just another knucklehead pontificating and rehashing the same old tired argument that has gone on for, oh, I don't know, twenty years or better. I can no longer count how many times this same argument has been made every time a new technology comes out. It's the same bull crap nonsense that started way back when 8K of core memory (you remember that, right?) was deemed to be plenty. And I'm sure it goes even further back in time.

Memory bound, not CPU bound ... (5, Informative)

gstoddart (321705) | more than 8 years ago | (#14985130)

In my experience, and I'm a software developer so take that with a grain of salt, the vast majority of people will get more performance from more memory than more CPU speed.

I'm almost never CPU bound if I have enough memory. If I don't have enough memory, I get to watch the machine thrash, and it crawls to a halt. But then I'm I/O bound on my hard drive.

Dual-CPU/dual-core machines might be useful for scientific applications, graphics, and other things which legitimately require processor speed. But for Word, IM, e-mail, a browser, and whatever else most business users are doing? Not a chance.

Like I said, in my experience, if most people would buy machines with obscene amounts of RAM, and not really worry about their raw CPU speed, they would get far more longevity out of their machines.

There just aren't that many tasks for which you meaningfully need faster than even the slowest modern CPUs. If you're doing them, you probably know it; go ahead, buy the big-dog.

Repeat after me ... huge whacking gobs of RAM solve more problems than raw compute power. Always has.

Not Now, but a swell idea if you plan to run VISTA (2, Insightful)

pcguru19 (33878) | more than 8 years ago | (#14985132)

The big ole bag of ass that will become Vista someday is going to make good use of that 2nd core. The current preview version loves all the CPU, RAM, and Video processing you can throw at it.

Where I work, we're starting to use VMware or Virtual PC to isolate troublesome apps so one crappy application doesn't kill a client's PC. Virtualization on the desktop will expand to get around the universal truth that while you can install any Windows application on a clean Windows OS and make it run, apps two and beyond aren't guaranteed to work together. Between virtualization and Vista, it's wise for business customers to OVERBUY today so the machines are still usable in 3-4 years.

Don't you mean... (1)

Trogre (513942) | more than 8 years ago | (#14985135)

... from the "well-duh" department.

Re:Don't you mean... (1)

s_p_oneil (795792) | more than 8 years ago | (#14985285)

Exactly! I was going to make a "well-duh" post, but I guess it's pointless now. As a software developer, I can honestly say that only the software developers at my company need that extra horsepower. Unless they're trying to play the latest games on their work computers. Did the article mention whether people need better graphics/sound cards for their work computers? ;-)

Unless you run a secure OS (0)

Anonymous Coward | more than 8 years ago | (#14985155)

Even quad-core may not be enough if your computer is busy crunching spyware, adware and bots, and sending spam all over the world.

The more cores you have, the more important it is to run a secure multimedia OS such as Tomahawk Desktop [tomahawkcomputers.com].

VoIP, high definition audio and video, complex office suites, 3D demos, etc. may very soon demand more than dual-core.

Since when.... (2, Insightful)

countach (534280) | more than 8 years ago | (#14985170)

Since when was "legitimate business purposes" part of the equation? Many business users just using office and email could use a 5 year old PC. But the industry moves on. Lease agreements terminate. The upgrade cycle continues its relentless march. Smart businesses could slow their upgrades down. Typical businesses will keep paying Dell and keeping HP's business model afloat.

Obligatory Quotes: (2, Interesting)

absurdist (758409) | more than 8 years ago | (#14985173)

"640 KB should be enough for anybody."
-Bill Gates, Microsoft

"There is no reason why anyone would want a computer in their home."
-Ken Olsen, DEC

One reason: better user experience (3, Informative)

mfifer (660491) | more than 8 years ago | (#14985175)

in our financial world, users often have several spreadsheets open (deeply linked to other spreadsheets), Bloomberg, Outlook, several instances of IE, antivirus software and antispyware software running in the background... you get the idea.

the more memory and horsepower I can provide them, the better experience they have with their machines. and empirically it seems that underpowered machines crash more; they sure generate more support calls (app X is slooowwww!!!)

same goes for gigabit to the desktop; loading and saving files is quicker and those aforementioned linked spreadsheets also benefit from the big pipes...

IF one can afford it, and the load is heavy as is our case, every bit of power one can get helps...

-=- mf

I'm a developer/business user/ and gamer (3, Interesting)

ikarys (865465) | more than 8 years ago | (#14985177)

I will benefit from multi-core.

I'm perhaps not a typical business user, but what business wants is more concurrent apps and more stability. Less hindrance from the computer, and more businessing :)

Currently, I have a hyperthreaded processor at both home and work. This has made my machine immune to some browser memory leak vulnerabilities, whereby only one of the threads has hit 50% CPU. (Remember, just recently there was a leak to open Windows Calc through IE? I could only replicate this on the single core chips.)

Of course hyperthreading is apparently all "marketing guff", but the basic principles are the same.

I've found that system lockups are less frequent, and a single application hogging a "thread" does not impact my multitasking as much. I quite often have 30-odd windows open... perhaps 4 Word docs, Outlook, several IEs, several Firefoxes, perhaps an Opera or a few VNC sessions, and several Visual Studios.

On my old single thread CPU this would cause all sorts of havoc, and I would have to terminate processes through Task Manager and pray that my system would be usable without a reboot. This is much less frequent with HT.

With multi-core, I can foresee the benefits of HT plus the added benefit of actually being 2 cores as opposed to pseudo 2 cores.

For games, optimised code should be able to actively run over both cores. This may not be so good for multitasking, but it should mean that system slowdown in games is reduced, as different CPU intensive tasks can be split over the cores and not interfere with each other.

(I reserve the right to be talking out of my ass... I'm really tired)

Unbelievable (3, Insightful)

svunt (916464) | more than 8 years ago | (#14985201)

I really can't believe this debate is ongoing. It's really the same thing, as has been pointed out above, as any "I don't need it this week, so it's just not important, period" argument, which can be traced back some decades now. For some of us it's worth the early adopter price; for the rest, it's worth waiting until it's a much cheaper option. But as we all should know by now, what Gateway giveth, Gates taketh away. As the new hardware becomes available, software developers will take advantage of it. The only question is: how long can you hold out while the price comes down? It'll be a different answer for all of us. There is no definable "business user" to make such generalisations about accurately.

For Pros but maybe not so much for business use (0)

Anonymous Coward | more than 8 years ago | (#14985226)

Most software actually doesn't support dual core. Even Maya only supports it in rendering. The primary uses are multiple applications running at once, as others have mentioned, and rendering in 3D and 2D apps that take advantage of multiple processors. Most people will see no increase, but if they open multiple applications they should be more stable and run faster. I'd say it affects professional users more than business users, if that makes sense. You might see benefits if you have multiple large spreadsheets open at once, but most business users aren't going to see much difference. For graphics they are a real boon. If you have a quad board running dual core chips you have essentially an 8-node renderer. Not a true eightfold increase, but still drastically faster than a standard quad system, let alone a single processor. Personally, I can't wait for the quad core chips, but they aren't expected until next year. Eventually games will take advantage of the extra rendering power. By the time that happens there'll probably be quad core chips, so the speed increases could be substantial.

Obligatory Patches O'Hoolihan quote (0)

Anonymous Coward | more than 8 years ago | (#14985245)

Necessary? Is it necessary for me to drink my own urine? No. But I do it anyway, because it's sterile and I like the taste...

Multi-core will be the new ePenis (1)

MrNougat (927651) | more than 8 years ago | (#14985264)

The CEO will insist on having an 8000GHz, 256-core machine with 12TB of RAM and an infinity-plus-one hard drive, so he can feel more important.

Even though all he uses for work are Outlook and Word, neither of them well, and he installs every ActiveX control that promises free porn.