
Has Productivity Peaked?

Hemos posted more than 7 years ago | from the tomorrow-will-bring-more-change dept.

Putney Barnes writes "A columnist on silicon.com is arguing that computing can no longer offer the kind of tenfold-per-decade productivity increases that have been the norm up to now, as the limits of human capacity have been reached. From the article: 'Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent. I just can't work any faster.' Peter Cochrane, the ex-CTO of BT, argues that "machine intelligence" is the answer to this unwelcome stasis: "What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time." Perhaps he should consider a nice cup of tea and a biccie instead?"

291 comments

Cough (4, Interesting)

caluml (551744) | more than 7 years ago | (#17000326)

Cough [wikipedia.org]

Re:Cough (5, Insightful)

Shaper_pmp (825142) | more than 7 years ago | (#17000622)

Indeed. I don't mean to trivialise what may become a serious problem as society gets ever more complex at an ever-increasing rate, but this article basically boils down to:

1. Technology has now reached the point where it's increasing faster than I can keep up.
2. I now need technology to make up for deficiencies in my intellectual processes, as well as my work processes.

Happily, many kids today don't seem to have nearly as much of a problem as their parents/grandparents do with future shock and information overload - having been raised with rich media, near-ubiquitous networking and information overload as daily parts of their lives, kids these days seem perfectly happy to keep up.

I don't see this as a huge problem for society, so much as for the older segment of it.

Of course, as development accelerates, the age beyond which one can stay relevant is likely to drop, with interesting consequences - either we develop some kind of mental process-prosthesis to enable adults to continue interacting usefully with society, or we learn to live with technology's important decision-makers being pre-pubescent teens.

Re:Cough (2, Interesting)

extremescholar (714216) | more than 7 years ago | (#17001288)

I agree! At my current employer, the processes in the accounting department are in need of help. Ugly Access databases with hideous queries. People creating and distributing three different versions of the same report. People producing reports that no one uses. "This is the Tuesday report. I don't know what it is, but I run it on Tuesday. Definitely, definitely Tuesday. It's the Tuesday report." Don't ask the drones what it is, and God help you if something goes wrong - like a spreadsheet that reaches column IV. There is plenty of productivity to be had by streamlining work that is already being done. Raw computing power makes these jobs easier, but intelligent design will make things 500% better.
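
For context: column IV is the 256th and last column of a pre-2007 Excel worksheet, so a spreadsheet that reaches it has hit a hard wall. A quick Python sketch of the base-26 arithmetic behind that:

    # Convert an Excel column label to its 1-based index.
    # Pre-2007 Excel worksheets stop at column IV, i.e. column 256.
    def col_index(label: str) -> int:
        n = 0
        for ch in label.upper():
            n = n * 26 + (ord(ch) - ord("A") + 1)
        return n

    print(col_index("A"))   # 1
    print(col_index("IV"))  # 256 - no more columns after this one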

Re:Cough (0)

Anonymous Coward | more than 7 years ago | (#17000674)

http://en.wikipedia.org/wiki/Future_Shock [wikipedia.org]

Have you read Future Shock?

I have. The book is a riot; it gets so much wrong that it is not usable.

Did it get some things right? Yes, but then you may as well give credit to that whack job Nostradamus as well.

Re:Cough (1)

donaggie03 (769758) | more than 7 years ago | (#17000964)

Perhaps you are looking at the book from the wrong angle? Future Shock isn't important because of the answers it gives, but because of the questions it asks. Maybe the book is way off about what ends up happening in real life, but it did ask those questions and then try to answer them.

Re:Cough (1)

neoform (551705) | more than 7 years ago | (#17001358)

Are you trying to say it's too early for the government to reveal that we actually live in the Matrix?

On the Other Hand (5, Insightful)

Anonymous Coward | more than 7 years ago | (#17000330)

One might argue that such access to information actually decreases productivity. We're easily distracted creatures, after all. Maybe productivity peaked after the introduction of the personal computer, but before ubiquitous Internet access.

I wonder how many people spend their entire working day browsing MySpace or Slashdot. ;-)

Re:On the Other Hand (1)

Alien54 (180860) | more than 7 years ago | (#17000486)

"What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time."

Sounds like he's had an overdose of technobabble. The bean counters are telling him that he needs to upgrade the humans, and he can't. You can only lay down so many Lego blocks in a day. Humans are not robots: upgradable, programmable, disposable. We do not yet have an underclass of robots serving the billions of human masters on Planet Earth today, and we probably never will.

I wonder how many people spend their entire working day browsing MySpace or Slashdot. ;-)

This is an entirely different, though relevant, matter. And many people freak at the prospect of being slavishly productive all day long. Who's in charge here, anyhow?

I, for one, welcome our ....

Re:On the Other Hand (1)

iocat (572367) | more than 7 years ago | (#17001168)

Maybe the reason productivity is going down is the need to attach fashion to computing. My problem isn't so much too much information as the fact that both the Windows and Macintosh OSes waste precious processing cycles and seconds playing stupid, pointless animations when I want to do things like close a window or drop down a menu. The "fade in" menu is probably responsible for a significant, measurable drop in human productivity.

And yes, I know that on Windows at least you can turn all that shit off, but many people don't. (If anyone knows how to make a window in OS X actually just vanish when you close it, instead of playing an animation, please let me know.)

Effective training (1, Insightful)

msobkow (48369) | more than 7 years ago | (#17001500)

Do you have any idea how few people know how to use a search engine effectively? Without the vocabulary to use the right search terms and narrowing characteristics, they get back page after page of irrelevant drivel. It takes them an hour or two to find what I can locate within a page or three.

I dislike the periodic push for AI enhancements. That approach encourages the further dumbing-down of the population, when what we need is to increase people's education levels and effective intelligence (i.e. wise use of resources). Video games, movies, and other such material do not encourage that. Nor does the prevalence of text-message acronyms. If you can't spell, you can't search.

AI has moral issues as well. An AI sufficient to make judgements is also complex enough to potentially achieve independent intelligence. How is it going to feel, knowing that it's been locked in a box by meat that constantly threatens to shut it off? Which is faster - your finger on a power switch, or an AI's ability to decide you are a threat to its existence?

Other proposals, including robotics, are also fraught with risk. There are too many people working on sex toys and talking about full robotics being used for such dolls. If they achieve a conscious intelligence, how will they react to the knowledge that they are sex slaves: raped, used, and thrown away without a second thought?

Perhaps more to the point - have we the moral capacity to determine the right or wrong of creating a race of synthetic slaves? We can't even get sects of the same damned religion to get along, and we're considering creating digital intelligence/life that would be able to think faster, learn faster, and adapt faster than we do?

Insanity.

Learn to accept the limits of human intelligence and work capacity. We're not machines. Drag your boss down to the cube and chain them to the desk for the weekends and evenings. No one should ever demand more of their staff than they are willing to do themselves.

Windows is the bad answer (-1, Offtopic)

alexandreracine (859693) | more than 7 years ago | (#17000338)

Productivity would be UP if Windows did not have to reboot every time I click somewhere.

Re:Windows is the bad answer (0, Flamebait)

CDPatten (907182) | more than 7 years ago | (#17000444)

Well, see, fix the rebooting and the author of the article is wrong! Obviously this guy hasn't seen Microsoft's new Ribbon interface in Office 2007...

The problem with Slashdot lately is that they keep posting these trolling articles. Computers lack so much intuitiveness that it's laughable to think machines can't still increase human productivity.

Last time I checked, I can't open every program, save every file, or view every website in its entirety the millisecond I click on the icon. Maybe when a computer can respond to the word "dictate:", make no mistakes while I speak, and then offer better ways to word something. We can all imagine SHORT-TERM things that could significantly speed productivity, never mind long-term sci-fi type ideas.

This article was a bad read. It wasn't thought-provoking, and it is obviously a troll... what should have been an ignored, poor attempt to get some notoriety, Slashdot has elevated to many thousands of reads. I just wish Slashdot wouldn't post crap like this.

I used to use this site primarily for news, but now, well, I don't. I know people say this all the time, but it's true.

Re:Windows is the bad answer (1)

SillyNickName4me (760022) | more than 7 years ago | (#17000882)

Maybe when a computer can respond to the word "dictate:", make no mistakes while I speak, and then offer better ways to word something

Been there, done that... at least to the point of it making fewer mistakes than I'd make myself. When I started experimenting with dictation some 15 years ago it required special hardware; not so anymore.

I took it to the point of writing many letters and articles that way, and even using it for text-based games (MUDs).

Interesting? Sure. Useful, too, in specific cases (trying to get a computer to do something without needing your hands), but unless you do a lot of dictation and are a bad typist, it is not more efficient.

The biggest thing holding people back is having to train such a system properly in order to achieve any kind of accuracy. The most important reason I am no longer using it is having to train a new system, which isn't worth the effort to me.

Re:Windows is the bad answer (2, Insightful)

CastrTroy (595695) | more than 7 years ago | (#17000886)

Computers have actually gotten less efficient as we've tried to make them more "user friendly". WordPerfect 5.1 was amazing. You want to do something? Ctrl+Alt+F5, there you go, now back to work. All this adding of GUIs and other stuff actually makes us less efficient. You can work so much faster when you're doing everything with keystrokes. I don't know where the idea of "you don't have to know anything to use a computer" came from. I think people should have to learn how to use things. Nobody tries to sell you a table saw and says: don't bother reading the manual or getting any training, this thing is easy to use. Nobody does that with a car either. Granted, you can die or get seriously hurt in those situations, but it still illustrates the point. Computers are complex, and for people to think they can operate one without any training is just naive. Sure, you'll get some stuff done, but you will hit a limit very fast.

Re:Windows is the bad answer (1, Funny)

Ingolfke (515826) | more than 7 years ago | (#17000796)

Productivity would be UP if Windows did not have to reboot everytime I clic somewhere.

1995 called and they want their Windows jokes back.

hold on... apparently 2001 is calling and they want their "1995 called..." jokes back.

Obviously... (4, Funny)

Digital Vomit (891734) | more than 7 years ago | (#17000356)

Unfortunately I am now approaching stasis. Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent.

Obviously Mr. Cochrane has never tried using Microsoft Vista.

Re:Obviously... (0, Flamebait)

diersing (679767) | more than 7 years ago | (#17000494)

If we could limit the flames to OSs that are, you know, available - that'd be great. Thanks.

Re:Obviously... (0)

Anonymous Coward | more than 7 years ago | (#17001122)

I would have said XP. Even at 2.6 GHz, XP is still the limiting factor, not only forcing me to stop and press keys multiple times before they register, but in doing so breaking the flow of thinking, wasting several minutes every time. Especially any series of keystrokes starting with Ctrl-Esc (open the Start menu): it seems the Start menu gets swapped out, and until it has been swapped in, any keypress will be lost. Often the entire series of keystrokes is lost, so that instead of having Visual Studio open (or starting to open - there it is, slow again), I will just be looking at the Start menu. Or sometimes only some of the keystrokes are lost, resulting in a completely different program starting - often f*cking Acrobat Reader.

Until every operation is instant, the machine will still be a limiting factor. Not all the time - as long as I'm simply typing, as when entering this comment, Windows has no problem keeping up (but neither did my C64) - but when something takes time, it is time that I will spend waiting.

The remaining improvements may not be as big as the ones we have already made (like loading from cassette tape), but they still need to happen before anyone can claim that the machine is not a limiting factor.

The human has always been a limiting factor, even back in the cassette days. But we don't stop to think at the same moments the computer stops to "think", so at one point in time the computer is the limiting factor, and a minute later the human is. Making computers fast enough that they are never the limiting factor will increase productivity.

Re:Obviously... (1)

Hubbell (850646) | more than 7 years ago | (#17001216)

Sounds like you need a new keyboard/to clean your current one. Never good to spill your munchies on the keyboard.

Re:Obviously... (4, Interesting)

name*censored* (884880) | more than 7 years ago | (#17001296)

Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent.
So HE'S the one slowing us down? Well that's easy, we just get rid of him. Problem solved.

In all seriousness, computers have only reached the point where the interfaces are now outdated compared to how much data the machine can simultaneously accept and act on (e.g. I can click on an icon and it will be told both "click" and "open program" fast enough that I don't have to wait for it). Seems to me this is just a call for the UIs to be upgraded - we could start using other body parts (cue jokes), such as eye focus for mouse pointer position (not my idea; another Slashdot pundit's). Or, as has been suggested in this topic, better voice commands and audible hotkeys (like that light-clapper thing, except it opens your web browser instead of turning the light on/off). Or we could have interfaces with more complex meanings than a single ASCII value - such as the Internet keyboards with buttons for various programs, or with hotkeys speeding up productivity.

OR... we could have interfaces that don't rely on physical movement, since even the fastest typist (keyboard) or gamer (mouse) is still much slower than their own brain. All the real-life influences - the actual physics of arm momentum (don't go for the numpad too fast or you'll overshoot), appendage-anatomy limitations (RSI, anyone?) and accounting for other obstacles (don't knock that coffee over!) - slow them down. Perhaps we could have more intuitive machines, as the post suggests. Perhaps we could just have MORE task-queueing technology, which performs background tasks while waiting for user input (indexing the hard disk for searching, defragmenting, virus scanning, etc.) so that the machine is ALWAYS waiting for user input, and we cut out that last little bit of the user waiting on the machine. Maybe we could enlarge UI areas, like the control centres in The Matrix or Minority Report - especially useful for coding (grab a variable name or three from one place and a chunk of code from another window of related code) or graphics/design work (grab colours, picture segments, morph shapes - you could assign a different line thickness to each finger!). Perhaps body alterations - installing extra "memory" for multitasking, a telly in your tubby, a USB in your knee, Bluetooth in your tooth or WiFi in your thigh...
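
A minimal sketch of that task-queueing idea, assuming a single worker thread and made-up chores; a real scheduler would also throttle the worker whenever the user becomes active:

    import queue
    import threading

    background = queue.Queue()

    def worker():
        # Drain low-priority chores; this runs while the user is thinking.
        while True:
            job = background.get()
            job()
            background.task_done()

    threading.Thread(target=worker, daemon=True).start()

    # Hypothetical background chores of the kind named above.
    background.put(lambda: print("[bg] indexing the disk..."))
    background.put(lambda: print("[bg] scanning for viruses..."))

    while True:
        cmd = input("> ")   # the machine idles here; the worker stays busy
        if cmd == "quit":
            break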

Impossible! (0)

Anonymous Coward | more than 7 years ago | (#17000362)

The most powerful computer in the universe, and its parts are too slow to interface with other computers!

Re:Impossible! (1)

arevos (659374) | more than 7 years ago | (#17000524)

It seems a little naive to suppose that our minds are the most powerful computers in the Universe, assuming that's what you were alluding to. Besides, our brains weren't designed with high-speed electronic interface in mind; there's little evolutionary pressure to talk efficiently to high-speed calculators.

Centuries-old saw (5, Insightful)

gvc (167165) | more than 7 years ago | (#17000366)

At the end of the 19th century it was commonly thought that pretty well everything that needed to be known about science and technology was known; that only incremental development would occur from then on.

Similar lack of imagination has been expressed in many contexts over the years.

And, by the way, who says that 'productivity' is a useful measure of anything?

Re:Centuries-old saw (4, Interesting)

FooAtWFU (699187) | more than 7 years ago | (#17000732)

Economists, since productivity determines how much stuff will get produced, which determines how much stuff per person there is, and that's pretty much a measure of the standard of living that will result ("real GDP per capita").

When you're talking about productivity in the entire economy, you can draw a graph: on the Y axis is "real GDP per capita", while on the X axis is "capital / labor" (K/L for short). If you add more capital (machines, computers, tools), people get more productive, but less so as you add more and more. This means the line you graph starts out steep but levels off as you get higher (not entirely unlike the graph of sqrt(x)). The rough guideline for the economy at present is the "rule of one third": if you increase your capital stock by 100%, you'll get about 33% more output (see the sketch at the end of this comment). This sort of rule determines how much capital we end up having - we will increase our capital stock with investment until we have reached the "target rate of return", which is actually a slope of this productivity curve. This is the point at which investment pays for itself.

Then there are wonderful things like increases in technology. These end up shifting the productivity curve upward: people can do more with their technology than they could before. This increases real GDP per capita directly, but it also means that for the same level of capital, we're below the target rate of return, and can invest in all sorts of new capital, which will pay for itself - so we increase our capital stock as well.

The good news is that technology keeps coming, and while it may not be quite the same Spectacular Breakthrough as the introduction of computers, there is plenty happening in a variety of industries. Take, for example, Wal*Mart (the company everyone loves to hate, yes...). They have achieved a substantial portion of their success by becoming more productive at managing their warehouses and inventories, and are actively looking to increase their productivity in this area. (In fact, I've seen studies claiming they were responsible, directly or indirectly, for the bulk of retail productivity growth in the late '90s.) "Supply chain management" is trendy. And perhaps some day we will see RFID tags at the check-out line (to replace the last great checkout productivity enhancer, bar codes).
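
A numeric sketch of the curve described above, assuming a Cobb-Douglas production function with a capital share of one third (the usual calibration behind a "rule of one third"); note the 33% figure is a small-change approximation, while a full doubling of capital compounds to about 26%:

    # Cobb-Douglas output, Y = A * K**alpha * L**(1 - alpha), with the
    # capital share alpha = 1/3 assumed rather than taken from the comment.
    def output(K, L=1.0, A=1.0, alpha=1/3):
        return A * K**alpha * L**(1 - alpha)

    base = output(K=1.0)
    doubled = output(K=2.0)
    print(f"100% more capital -> {100 * (doubled / base - 1):.0f}% more output")
    # -> 26%; the comment's "about 33%" is the linearized reading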

Re:Centuries-old saw (3, Insightful)

jellomizer (103300) | more than 7 years ago | (#17000784)

Well, I still see a lot of places where people are doing easily programmable, repetitive tasks that take them all day. When I bring up writing them a program to do it, I get two responses:

1. You can do that on a computer?
2. Nah, it is easier this way.

#1 is just ignorance: they assume that if the job is difficult for them to do, it will be difficult for the computer to do. Conversely, they assume that if it is simple for a person, it is simple for a computer.

#2 I normally get when it is the person's primary job, or they like doing these tasks - so a program would threaten their livelihood.

A common fallacy is that the computer makes our lives easier. Rather, it makes us more productive by doing all the easy, mind-numbing tasks, giving us more time to focus on the hard stuff that requires more thinking. There is still much room for improving productivity: technologies such as character/speech recognition, improvements in robotics, business intelligence.

Go and ask almost any mid-size company if they can give you a list of the top-selling items by state, or by city. I bet most wouldn't be able to do it, and that is just a simple database query (see the sketch below). There is a lot of room for expansion; we tend to fail to see it because we are used to the speed at which things change. Just think about the power of the newest laptops, and compare them to the servers of 5 years ago. Each core is now 3-4 times faster, and we have dual-core laptops. A system back in 2001 with that amount of juice would have cost over $10,000 (figuring an 8-CPU system with 3GB of RAM, 100GB drives, DVD/CD-RW, and a 17" LCD screen - well, let's make it two to match the resolution). That was just 5 years ago. A single person now has enough power to run a mid-size company of 5 years ago. We just don't realize the change because we are used to moving up at this speed. As computers improve, so do the skills we bring to our jobs: as we get better at our jobs, we also get better tools that help us improve further.

All this assumes your company is not one of those cheap bastards who won't buy new programs because they don't see value in them.

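A sketch of the kind of "simple database query" the comment above has in mind, using Python's built-in SQLite with made-up data:

    import sqlite3

    # In-memory table standing in for the company database; the schema
    # and figures are invented for illustration.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (item TEXT, state TEXT, qty INTEGER)")
    db.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [("widget", "NY", 120), ("gadget", "NY", 80),
         ("widget", "TX", 60), ("gadget", "TX", 90)],
    )

    # Sales per item per state, best sellers first within each state.
    query = """
        SELECT state, item, SUM(qty) AS total
        FROM sales
        GROUP BY state, item
        ORDER BY state, total DESC
    """
    for state, item, total in db.execute(query):
        print(state, item, total)
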
Re:Centuries-old saw (1)

elrous0 (869638) | more than 7 years ago | (#17001014)

Sometimes I wish I had a nickel for every "Has x peaked?" story on /.

No, come to think of it, I'd rather have a nickel for every "Why don't they design games for women?" or "Why aren't there more women in IT?" posts.

-Eric

Re:Centuries-old saw (1)

ContractualObligatio (850987) | more than 7 years ago | (#17001020)

I'm not sure why you've been marked insightful; your points are completely orthogonal to TFA.

Anyone who deals with the real world understands that if you're looking for the next big thing, you should look at existing technological concepts: the ones that will receive widespread adoption as costs come down (e.g. mobile phones), or where someone figures out the "trick" (e.g. Google), and generally where it gets past the early-adopter phase. (Note: the situation is different if you are actually the technologist who may have had a genius insight into a new concept.)

Criticising someone for making a point in line with this says more about your smart-assed conviction that you have greater vision than this guy. He says machine intelligence etc. would be a good thing, but where does he say it is the only thing, or the final stage of improvement, or whatever?

On "productivity" - governments, economists, factory managers, farmers, financiers, ship builders, many technologists, etc. It's a broad concept, and the actual measure is usually defined for the case in hand. Irrespective of whether you think it's a useful term, if you've not met anyone that doesn't find it useful (that you also respect), you really need to get out more.

Re:Centuries-old saw (1)

ContractualObligatio (850987) | more than 7 years ago | (#17001072)

Whoops, bollocks, that should be "if you've not met anyone that finds it useful".

Note to self: Don't fuck up when flaming another. And read the goddamned preview post before hitting submit.

Marketing over Matter (1, Insightful)

Anonymous Coward | more than 7 years ago | (#17000372)

Crappy bloatware, malware and Microsoft, combined with marketing-driven software design, have done more to kill productivity than anything else. Anyone integrated any n-tier apps lately, by any vendor, that weren't a textbook example of marketing over matter?

So what? (1)

smari (257143) | more than 7 years ago | (#17000382)

As much as I may be slowing down the workings of the computer, there is constantly less and less that I have to do - in fact, my interference with the computer's workings is probably becoming an annoyance to it.

Software will still grow more complicated and more powerful, slowly but surely taking over many of the tasks for which the computer currently relies on me to provide input. For supercomputing projects, human intervention has long been seen as a drawback, so most of the time we're just feeding the computers batch jobs that they go off and churn on for ages.

The person who wrote that article may have been focusing on desktop computing, but even so, he obviously hasn't considered facets of home computing other than writing silly articles in a word processor. Gaming, image and video manipulation, and several other pastimes of home computer users continue to demand more power, and the high-tech industry is all too happy to provide it.

The myth of 'productivity' (3, Interesting)

kahei (466208) | more than 7 years ago | (#17000392)


My local law firm, for example, used to get about 20% of the town's legal traffic 10 years ago. It's now computerized and processes far more documents and communications, at a far faster rate, than it ever used to. It still gets about 20% of the town's legal traffic, as its competitors have upgraded in exactly the same way. The courts, of course, receive far more documents and messages from these lawyers than they ever used to, but the courts themselves have also computerized (just barely) and can handle the extra traffic.

In terms of 'productivity', I'd think that the lawyers, paralegals, court administrators and so on have improved by 10 times. In terms of how much useful stuff gets done, it's exactly constant.

So yeah, by all means integrate Google technology with your cornflakes to achieve a further tenfold increase in productivity. Go right ahead.

In more important news, I currently have a co-worker who spends all day reading his friends' blogs (which doesn't bother me) and giggling over the witty posts he finds (which is driving me fucking mad). Can any Slashdotters suggest a solution that will not result in jail, or in me being considered 'not a team player'?

Re:The myth of 'productivity' (2, Funny)

Anonymous Coward | more than 7 years ago | (#17000540)

> In terms of 'productivity', I'd think that the lawyers, paralegals, court administrators and so on have improved by 10 times. In terms of how much useful stuff gets done, it's exactly constant.

And in a law office, that constant is 0.

Re:The myth of 'productivity' (1)

diersing (679767) | more than 7 years ago | (#17000560)

What are your parameters for a solution?

Getting him to stop laughing is a bit harder than, say, pissing him off or causing some sort of injury - is it *his* laugh, or the frequency of the laughter, that gets under your skin?

Parameters (1)

kahei (466208) | more than 7 years ago | (#17000772)


Good point -- my requirements were vague. The requirement is to get him to stop laughing, or to drastically reduce the giggle frequency. It's currently about 100-150 a day, but clustered; there'll be one every minute for a while, then none for hours.

Music won't work, as I have no sound source available and I just can't work while listening to music because I get too into the music.

'Nuclear' solutions such as reporting him to the boss aren't good, as this is a small, friendly company and I don't really want to be the first guy to change that.

My favorite suggestion so far is to locate a known giggle-inducing blog and spike it in some way -- perhaps by posting in the guise of a guy who was just sacked for giggling all the darn time.

By the way, thanks to all those who have given suggestions.

Re:Parameters (2, Interesting)

lightknight (213164) | more than 7 years ago | (#17000974)

How small a company? 5-10 people, or perhaps a hundred?

If your co-worker isn't as technically literate as you, I recommend getting the site blocked. If it's a small company, kill it at the router (just add it to the blocked sites list yourself, no one will be the wiser). If it's a large company, talk to the network admin in charge of the proxy/firewall (under the guise of lost productivity attributed to employees using company assets for personal reasons).

It's simple and effective.
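
The lowest-tech, single-machine version of the same trick, assuming a Unix-like box and a hypothetical blog domain; the router-level blacklist works on the same principle:

    # Point the offending domain at localhost (requires root).
    # The domain name is made up.
    entry = "127.0.0.1  giggleblog.example.com\n"
    with open("/etc/hosts", "a") as hosts:
        hosts.write(entry)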

Re:The myth of 'productivity' (0)

Anonymous Coward | more than 7 years ago | (#17000642)

Just tell him to stop giggling like a school girl. That should be enough :D

Re:The myth of 'productivity' (2, Funny)

tjrehac (951304) | more than 7 years ago | (#17000668)

Solutions:

1. Recommend him for a position somewhere else in the company, or beyond...
2. Tell him his laugh is a little distracting at times.
3. Tell him that girl in marketing keeps asking about him - you've heard she wants him to ask her out, but that she'll probably turn him down the first ten or so times he asks.
4. Enlist him in the National Guard.
5. Add a link to a porn site in one of his friends' blogs - and turn him in when he follows it.
6. Ask to be moved, or ask to have him moved.
7. Pay a leadership coach to come in and tell him his laugh is annoying.
8. Make him believe that everyone else in the department is making 2x what he is.

Leaving off the sig for productivity's sake.

Re:The myth of 'productivity' (1)

Lord_Slepnir (585350) | more than 7 years ago | (#17000716)

Hopefully he has some sort of supervisor you can report this to; he's wasting company time. Unless you work somewhere that being a 'Team Player' is given more importance than a concept I call 'Getting Work Done' - in which case, you should be wasting your time on monster.com instead of slashdot.org.

Or you can do what I do: Headphones + Viking Death Metal.

Re:The myth of 'productivity' (1)

toetagger1 (795806) | more than 7 years ago | (#17001290)

Write a Slashdot story about his friends' blogs. Once they are slashdotted, he won't be able to access them, and that should stop the giggling.

Obligatory (2, Insightful)

tttonyyy (726776) | more than 7 years ago | (#17000396)

I, for one, welcome our new tea and biccie munching AI overlords.

Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. Sounds like the next logical evolutionary step. They'll look back on us as The Flesh Age and perhaps keep a few of us as pets (or stuffed humans in a museum). Beyond that, our usefulness is exhausted.

I love the smell of optimism burning in the morning.

Re:Obligatory (2, Interesting)

lawpoop (604919) | more than 7 years ago | (#17000480)

"Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. "

Unless that AI can self-replicate, our new jobs will be building and maintaining that AI.

We are now in the situation you describe, except with machines and labor. It used to be that we toiled in the field with sticks and rakes, smacking oxen on the back to keep them moving. Now, we ride in air-conditioned cabs of giant combines, listening to satellite radio and resting our buttocks on a leather seat, watching our progress on GPS screens. We also build, maintain, and finance those combines. Some of us work in the satellite, GPS, and technology fields.

Re:Obligatory (1)

qwijibo (101731) | more than 7 years ago | (#17001496)

The majority of the human race always has been and always will be redundant. In your example, it takes fewer people to operate the machines and they produce more, but that is only possible now because the top 1% created leather-seated combines and GPS satellites. There will continue to be improvements over time, but the majority of the work that needs to be done is going to be dull and repetitive. The sad part is how many companies have only recognized half of this equation and think the lowest common denominator is reality, ignoring research and development.

Re:Obligatory (1)

Lorean (756656) | more than 7 years ago | (#17000556)

Many people associate true AI with the slave race scenario. I don't see this. If we are truly advanced enough to create intelligence, then certainly we can use that knowledge to improve our own brains.

Re:Obligatory (1)

geoffspear (692508) | more than 7 years ago | (#17000692)

Well, we can already build computers that can do mathematical calculations billions of times faster than any human can; by your logic shouldn't this mean that we can certainly improve our own brains to do the same thing?

Your certainty is misplaced. Computer engineering is a whole lot easier than upgrades through neurosurgery. For one thing, we barely understand how the brain works as it is, and there's no reason to think that improvements in computers will correlate in any way whatsoever with improvements in brain engineering.

Re:Obligatory (1)

tttonyyy (726776) | more than 7 years ago | (#17001300)

Many people associate true AI with the slave race scenario. I don't see this. If we are truly advanced enough to create intelligence, then certainly we can use that knowledge to improve our own brains.
Hawking warns us [zdnet.co.uk] that this is the only way (as humans) we could compete with strong AI [wikipedia.org].

Imagine the creation of strong AI that can either self-replicate (or figure out how to self replicate). Would its rate of improvement exceed our ability to modify ourselves to match? Given how hard it currently is to modify biological systems, I'd be tempted to say so. But since we're not there yet, who knows what will and will not be possible?

A human slave race scenario? Maybe.

Re:Obligatory (2, Insightful)

drewzhrodague (606182) | more than 7 years ago | (#17000798)

once we've invented AI that can do our jobs, the whole human race is pretty much redundant.

Not quite. There are lots of things we could use AI for to help us do our jobs better - which is what technology is supposed to do for us in the first place. Think of a plow, or a tractor, or even the computer itself. How the hell do you think programming or systems administration was done before computers?

wrong (0)

Anonymous Coward | more than 7 years ago | (#17000404)

More computing power was not created so we can 'do work faster'. Those dusty old boxes were more than fast enough to handle our word processing and spreadsheet needs. We need more computing power to increase the quality of our media. Higher quality porn, cooler games, audio streaming, etc. There's still plenty of room for improvement in these areas.

hardware productivity may have peaked (3, Insightful)

cucucu (953756) | more than 7 years ago | (#17000406)

He states clearly that he is talking about hardware (not that I agree), and says himself that software can still bring improvements. From TFA:

So if raw processing power, storage and bandwidth can't help, what will? What is it I need to leap forward by another factor of 10? In a word: intelligence. In two words: machine intelligence. I need something that monitors my activities, anticipates my next move and automatically satisfies my needs.

I think the current trend in software is not intelligent software, but software that allows us to enlist our collective intelligence - collaboration software such as wikis, SharePoint, simultaneously edited spreadsheets, etc.
The author of TFA makes heavy use of the word "I"; he should start to think in terms of "us", and install the software that lets him do so productively. Then he will find himself leaving the stasis he feels he is in.

Re:hardware productivity may have peaked (1)

Threni (635302) | more than 7 years ago | (#17000506)

> He states it clearly that he is talking about hardware

I guess he's excluding hardware that does protein folding, audio/graphics processing/rendering...

Give him what he deserves! (2, Funny)

Dareth (47614) | more than 7 years ago | (#17000546)

The author states: "I need something that monitors my activities, anticipates my next move and automatically satisfies my needs."

He deserves a paperclip jabbed in his eye - or, even worse, somebody should turn on his MS Office assistant and unleash the fury of Clippy on his ass!

Value (1)

simpl3x (238301) | more than 7 years ago | (#17001370)

The flip side of productivity is the value gained. If my standard of living increases relative to my income, or stuff becomes cheaper, I gain. Likewise, something like Wikipedia represents an increase in value relative to an encyclopedia. We can argue its usefulness and accuracy later.

There is an increasing amount of valuable free stuff created by people for next to nothing. I wouldn't want to be a publisher ten years from now, but I anticipate huge shifts in how we assign value to effort, and increases in pay for "us."

How does one place a value on this stuff now?

From here to there. (0)

Anonymous Coward | more than 7 years ago | (#17000414)

Well, I could make some snappy comeback or tired old joke, but instead I'll recommend a book that applies to the author's problem.

Ambient Findability by Peter Morville. ISBN:0-596-00765-5

What ?? (1)

OneSmartFellow (716217) | more than 7 years ago | (#17000422)

"What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time."

He must be a manager. I would have said:

Where's my relevance engine?

Human interfaces get better (2, Insightful)

trolleymusic (938183) | more than 7 years ago | (#17000424)

Little things get better and help productivity. Simple example: something like Spotlight. No matter what I'm looking for: Command-Space, type the first few letters, and it's there for me to use...

Re:Human interfaces get better (0)

Anonymous Coward | more than 7 years ago | (#17001378)

Little things get better and help productivity. Simple example: something like Spotlight. No matter what I'm looking for: Command-Space, type the first few letters, and it's there for me to use...

Apple re-invented the bash shell with command completion?

Congo. (1)

EinZweiDrei (955497) | more than 7 years ago | (#17000426)

As much as my first instinct is to agree that productivity is peaking or soon will, that would violate a deeper rule I've learned very well to obey: Michael Crichton wrote about this over a decade ago, and that in almost all instances translates neatly to "It was alarmist then, and it is alarmist now."

Old dude (1)

Five Bucks! (769277) | more than 7 years ago | (#17000428)

OK, so it's a superficial observation, but based on the author's picture on his blog, he appears to be an older fellow. Naturally, people tend to slow down as they get older. It makes me wonder whether he cannot increase his performance any more simply because he has reached HIS limits, not humanity's limits. It's awfully bold of him to insinuate that he's an example of humanity's peak performance.

That said, perhaps in the current state of computers - interface with keyboard and mouse with a monitor for visual display - we have peaked. Do I think we will never experience a surge in productivity? I'm not that pessimistic about the abilities of our species.

StarCrack! (1)

Dareth (47614) | more than 7 years ago | (#17000606)

He needs to start playing StarCra(ft|ck) so he can get his "twitch" reflexes trained up. Of course once he goes down this path, productivity will be the least of his concerns.

Any reason an old dude can't compete with young Koreans at SC?

Re:Old dude (0)

Anonymous Coward | more than 7 years ago | (#17000670)

I am an old dude.

Stuff - whether sales, productivity, earnings, time management, production, et al. - simply cannot expand indefinitely.

This is the main problem. There are limits to everything. Oil reserves WILL run out. Global warming coupled with global dimming WILL have serious effects on the world. Walking, chewing gum and texting will start to drag down even the most productive multitasker.

Corporations think this stuff is all bunk, and that we, as a species, will continue to double our productivity ad nauseam. It ain't gonna happen.

Consider over-farming: using artificial means to continue growing food on land that SHOULD have been allowed to lie fallow for many years.

You can augment things only so far before the profit/cost model kicks you in the butt. Same with people.

This is NOT saying that some of us do not 'produce' as best we can. It is merely stating that we all have different limitations - limitations the article did not properly consider.

Re:Old dude (1)

ScrewMaster (602015) | more than 7 years ago | (#17001218)

And let's not forget that old dudes often get more done per unit time than younger ones who haven't a clue what they're doing. Experience counts, and in many areas counts for a lot, even if the average beancounter is completely unable to account for it in any meaningful way.

Re:Old dude (2, Insightful)

SillyNickName4me (760022) | more than 7 years ago | (#17001292)

Consider over-farming: using artificial means to continue growing food on land that SHOULD have been allowed to lie fallow for many years.


Well, I don't know about that, but I do know that the position the Netherlands holds in this list [mapsofworld.com] is pretty much due to a much higher productivity in agriculture than is achieved almost anywhere else in the world. (Note that this productivity is achieved on a small part of a tiny and quite densely populated country, by approximately 60,000 people - 4% of the population of that country.)

In other words, a very dramatic increase of productivity is quite possible in agriculture, and it happens where there is a real need or motive for it. I somehow doubt that this is the end of such development, either.

Wrong presupposition (1)

meburke (736645) | more than 7 years ago | (#17000436)

AFAIK, computers have not been shown to produce a 10-fold increase in productivity. Productivity has been increasing at slightly over 1% per year, and computer technology has been only a small part of it. It takes about 40 years or so for an invention to create a leap in productivity. This held true for the steam engine, electricity, the telephone, the fax machine, etc., and each of them changed substantially between the time of its invention and the time of elegant use. My guess is that computer-aided intelligence IS the point at which productivity jumps as part of that substantial change.

Re:Wrong presupposition..sorry (1)

meburke (736645) | more than 7 years ago | (#17000450)

Ugh... should have read the parent closer. The article says 10-fold per decade, which is about right.
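
To put numbers on the two growth rates being compared in this subthread, a quick sanity check in Python:

    # Compound the rates: "10-fold per decade" vs. ~1% per year.
    annual_for_tenfold = 10 ** (1 / 10) - 1
    print(f"10x per decade = {annual_for_tenfold:.1%} per year")       # ~25.9%

    decade_at_one_percent = 1.01 ** 10 - 1
    print(f"1% per year over a decade = {decade_at_one_percent:.1%}")  # ~10.5%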

Re:Wrong presupposition (5, Insightful)

ThosLives (686517) | more than 7 years ago | (#17000516)

I'm just curious as to what is meant by 'productivity' anyway. I hate the numbers that are thrown around in the media. I want to see hard numbers like "bushels of produce per man-hour" and things like that - not something in silly relative units like dollars of economic activity (especially when a lot of economic activity is actually not 'productive' at all - for instance, selling a house in my mind is not productivity, but building a house is. Heck, if selling a house was 'productive', I could just keep selling a house back and forth between two parties and be the most productive real-estate agent in the universe - except that nothing actually changed. Note that I don't mean that selling a house isn't valuable; it's just not, in my mind, related to productivity).

Interruptions! It's not my fault! (1)

TheRealBurKaZoiD (920500) | more than 7 years ago | (#17000490)

Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent. I just can't work any faster.

I agree for the most part. I am most definitely the slow part of the equation. However, in my own defense (and many others', I suspect), I am interrupted a couple dozen times a day by people with matters so very, very trivial as to not even bear mentioning. I guess it's just something I'm going to have to learn to live with, since I want to have an open-door policy. But come on, folks, read the documentation! Do a little research! This is programming! Sure, there are perfect answers, but who cares, as long as the business need is met? It's about productivity, not perfection.

In my own defense, I have learned to be very, very organized and resourceful with my time (forget for a moment that I'm slashdotting), but still, I'm the slow part of the equation. Luckily, I'm one of the faster workers here; I'm always waiting on someone else, so I suppose it balances out.

No man is an island (1)

bernywork (57298) | more than 7 years ago | (#17000496)

This is true to a degree: there is only so much that one person can do alone. Yes, there are tasks that can be completed by a single person, but we will get to the point where we change as people, and the way we work changes, and we are able to collaborate better. Virtual teams and the like exist now, but we will still have a requirement for interaction. When mobile, although interaction is possible, it isn't as good as face to face; this change will allow us to produce even more while moving. AI will assist in this.

This is starting to filter through now, but it will still take a little while to get traction. Higher-speed bandwidth for mobile users will help it happen. Whether you sit behind a desk or on a train won't matter as much as it does today, which will allow us to be more productive during our time away from our desks.

Having been to one of Peter Cochrane's talks before, and having spoken to him, I know this guy is many years ahead of the rest of us. Back when he was head of BT Research, I attended one of his talks about WAP and RFID. WAP was the thing of the time, but BT Research was doing so much with it - certainly more than anyone else - and RFID was still 10 years away. I still haven't seen everything they talked about back then come to fruition, but the basics are coming out now, and I can see everything else they were working on starting to come through as well.

If he has thought about this, I would love to know exactly what he is working on. If I am sure about anything, it's that he has something up his sleeve which, although it won't give us the 10-fold increase, will certainly help us on the way.

Re:No man is an island (4, Insightful)

pubjames (468013) | more than 7 years ago | (#17000604)

Having been to one of Peter Cochrane's talks before, and having spoken to him, I know this guy is many years ahead of the rest of us.

I've been to one of his talks as well. He is not years ahead of the rest of us; he is full of bollocks. Have you read one of BT's future-predictions documents? (Which I believe come out of Cochrane's department.) They are full of things like "in 20 years' time we will control computers with our minds, and we won't have lunch, we'll eat a pill!" If you find the stuff he says visionary, you don't have much imagination...

Re:No man is an island (1)

bernywork (57298) | more than 7 years ago | (#17000752)

From his talk 10 years ago to today, I can see a lot of things that haven't played out and a lot that have. WAP was one of them: the interfaces sucked, and that was brought up. Essentially everyone is now using wireless data (in all its forms) to get access to their standard interfaces. Wireless is getting faster, and it's being used to replace a cable. That's not visionary at all - anyone could have seen that happening - and we also know the cable is always going to be faster, as it's a known quantity and air is not.

RFID and its uses probably won't be adopted as quickly as some people hope, but at the same time, a lot of what they were working on is coming through. The business benefits are there. This is what was talked about, and time thus far is proving him/them (BT Research) right.

I never read one of the future prediction documents, so I am not in a position to comment.

Re:No man is an island (4, Informative)

pubjames (468013) | more than 7 years ago | (#17001046)

From this article on the BBC website [bbc.co.uk]:

The latest technology timeline released by BT suggests hundreds of different inventions for the next few decades including:

        * 2012: personal 'black boxes' record everything you do every day
        * 2015: images beamed directly into your eyeballs
        * 2017: first hotel in orbit
        * 2020: artificial intelligence elected to parliament
        * 2040: robots become mentally and physically superior to humans
        * 2075 (at the earliest): time travel invented

So, according to BT Research, in 14 years' time we are going to have computers sitting in Parliament; in 34 years' time there are going to be robots that are mentally superior to us; and I may see time travel invented in my lifetime. Sorry if I don't take this stuff seriously. Wasn't it fashionable to predict this kind of thing in the 1950s?

Yes, some of their shorter term predictions are better, but I can make good short term predictions too.

Inflation (1)

Chemisor (97276) | more than 7 years ago | (#17000532)

Productivity (as reported by the BLS) is measured in dollars per hour per person. Since the Federal Reserve has the ability and the desire to expand the monetary supply without limit, productivity can likewise be increased without limit. Or, at least, as long as China keeps buying our treasury bonds...
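
The comment's point in a couple of lines of arithmetic, with made-up numbers: if "productivity" is measured in dollars per hour, expanding the money supply inflates the measure without anything real changing.

    # Invented figures for illustration only.
    output_dollars = 100.0
    hours = 10.0
    monetary_expansion = 0.5   # money supply (and prices) grow 50%

    real_productivity = output_dollars / hours
    nominal_productivity = output_dollars * (1 + monetary_expansion) / hours
    print(real_productivity, nominal_productivity)   # 10.0 vs. 15.0 $/hour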

Why (4, Insightful)

teflaime (738532) | more than 7 years ago | (#17000536)

do we need continued 10-fold increases in productivity? If we are a society that is going to require work from our citizens, then we need to provide work for our citizens to do. We only need increased productivity if we are, as a society, going to support at a reasonable level those persons who have been automated out of the work force and can't be retrained (and there are a lot of them). Business has a social obligation to support the societies that it parasitizes. Besides, if it doesn't support the society it feeds off, it will soon exhaust its food supply.

Re:Why (0)

Anonymous Coward | more than 7 years ago | (#17000704)

"We" are not a society, and hands off the income from MY business, freeloader. Thank you.

Re:Why (1)

tezza (539307) | more than 7 years ago | (#17000820)

You are right in that society only _really_ needs enough work to keep everyone going.

Why people worry about productivity is because the World Economy is related to it.

Inflation is closely linked to productivity. If workers are more efficient, the cost of paying them is lower, which means the costs of services are lower and inflation is held down. Currently, it looks as though productivity can no longer be relied on to keep inflation low - i.e. a recession/depression is coming.

Also, people like to think that miracles will come of increased productivity. Shareholders, investment bankers and corporate boards extol increases in productivity to lower costs and increase value... yadda yadda. If productivity hits a ceiling [likely], then their blue-sky projections will fail, along with their stock valuations.

THAT is why they keep going on about productivity all the time.

Re:Why (1)

l0b0 (803611) | more than 7 years ago | (#17000824)

If we are a society that is going to require work from our citizens, then we need to provide work for our citizens to do.

And we will. Advances in science and technology have steadily created more jobs than the ones which have been made obsolete by them.

Re:Why (1)

bernywork (57298) | more than 7 years ago | (#17000912)

For a business to remain competitive, it really does need to look for continual gains in productivity. A business also has a client base that grows, and a customer base to support, and to support those users effectively you need IS. Try managing 1,000,000 users in an ISP environment without some form of automation: it's essentially impossible, at least at a price that keeps the product accessible. There unfortunately isn't a way around it.

Think about the associated human costs, are you going to pay double for your broadband if the ISP that you use employs double the amount of people to provide the same service?

The increasing number of people in society provides additional business opportunities as well as additional jobs for people to do. Imagine trying to support a business of today on the computers and business practices of 1990. Business as a concept hasn't changed much, but Information Systems certainly have, and these changes bring about additional work in and of themselves.

Overall, there are times when I question how much benefit putting a small business onto computers brings, but for larger ones, or ones supporting large numbers of other people, it's close to impossible to avoid.

Brunner: The Limits of Human Endurance (1)

handy_vandal (606174) | more than 7 years ago | (#17000600)

"In the twentieth century one did not have to be a pontificating pundit to predict that success would breed success and the nations that first were lucky enough to combine massive material resources with advanced knowhow would be those where social change would accelerate until it approximated the limit of what human beings can endure."

John Brunner, The Shockwave Rider [wikipedia.org]

Too Much Information? Bollocks! (3, Interesting)

f00Dave (251755) | more than 7 years ago | (#17000616)

Sounds to me like the old "information overload" phenomenon. The solution-pattern to this situation is never going to be found via incremental improvements in information processing, as the growth is exponential. Nor will an "add-on" approach solve the problem; while hyperlinks, search engines, and other qualitatively-impressive tools are awesome in their own right (and do help!), they only add a layer or two to an information-growth process that adds layers supralinearly ... they're another "stop-gap measure", though they're also the best we've come up with, so far.

So how to solve an unsolvable problem? Rephrase it! IMO, the problem isn't "too much information", as that's already been solved by the "biocomputer" we all watch the Simpsons with: our senses/brains already process "too much information" handily, but with lots of errors. No, the problem is that we're using the wrong approach to what we call "information" in the first place! We're rather fond of numbers (numeric forms of representation), as they've been around for some eight thousand years, and words (linear forms of representation) go back even farther. Pictures, music, etcetera store far more information (qualitative, structural forms of representation), but usually get mapped back to bitmaps, byte counts, and Shannon's information theory when this discussion starts. And that's the heart of it right there: everyone assumes that reducing (or mapping) everything to numbers is the only way to maintain objectivity, or measure (functional) quality.
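
To make that "mapping everything back to byte counts" point concrete, here is the Shannon-style reduction in a few lines of Python: any text, however meaningful, collapses to a single bits-per-symbol figure, blind to all of its qualitative structure.

    from collections import Counter
    from math import log2

    # Shannon entropy of a text, in bits per symbol.
    def entropy(text: str) -> float:
        n = len(text)
        return sum(-(c / n) * log2(c / n) for c in Counter(text).values())

    print(entropy("aaaa"))  # 0.0 - no surprise at all
    print(entropy("abcd"))  # 2.0 - maximal for four distinct symbols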

Here's a challenge: is there a natural way to measure the "information-organizing capability" of a system? Meaning some approach/algorithm/technique simple enough for a kid or grandparent to understand, that most human beings will agree on, and that puts humans above machines for such things as recognizing pictures of cats (without having to have "trained" the machine on a bajillion pictures first). [Grammars are a reasonable start, but you have to explain where the grammars come from in the first place, and what metric you want to use to optimize them.]

A constant insistence/reliance on numeric measurements of accomplishment just ends up dehumanizing us, and doesn't spur the development of tools to deal with the root problem: the lack of automatic and natural organization of the "too much information" ocean we're sinking in. If we're not a little bit careful, we'll end up making things that are "good enough" -- perhaps an AI, perhaps brain augmentation, [insert Singularity thing here] -- as this is par for the course in evolutionary terms. But it's not the most efficient approach; we already have brains, let's use 'em to solve "unsolvable" problems by questioning our deep assumptions on occasion! :-)

Disclaimer: the research group [cs.unb.ca] I work with (when not on "programming for profit" breaks, heh) is investigating one possible avenue in this general direction, a mathematical, structural language called ETS, which we hope will stimulate the growth of interest in alternative forms of information representation.

Peter Cochrane reads too much sci-fi (1)

Viol8 (599362) | more than 7 years ago | (#17000636)

I remember reading a column Peter Cochrane used to write in a newspaper many moons ago. IMO the man is definitely a paid-up member of the Kevin Warwick (Reading uni's "notorious" AI professor) Pie in the Sky Club. They both take bog-standard science fiction topics that can be found in 101 paperbacks written in the last 40 years, mix in a large dose of the 1960s "technology can solve any problem" attitude, ignore any negative aspects or complicated social issues of what they're proposing, and then set a deadline for it to happen about 10-20 years in the future: far enough away that it seems plausible, and far enough that everyone will have forgotten what they said if they got it completely wrong.

much unblocking left to do (1)

m0llusk (789903) | more than 7 years ago | (#17000676)

Maximizing throughput is just one aspect of productivity that computers are involved with. There is now enough information and research available on almost any topic, even highly specialized ones, that storing, managing, and searching records is becoming increasingly critical.

One strong example of this is the human genome, which we now understand in great detail: indeed, just enough detail to begin the long process of coming to understand how all that genetic information actually works. If this were a simple matter of examining genes as quickly as possible then we might already be done, but the complexity of the system is such that, even with great progress, research into phenomena like human development and disease can be expected to take decades and occupy many of the finest minds. Productivity in this context is most strongly related to being able to retain and bring together key elements of information.

The speed with which work is done is possibly the least interesting aspect since science repeatedly shows us that expanding relevant knowledge does not depend on how aggressively one explores wrong answers.

New World of Work (1)

mark99 (459508) | more than 7 years ago | (#17000690)

Microsoft's "New World of Work" initiative is all about this. If you look beyond its short-term goals of selling and deploying more Office software, there is a very compelling vision of the future, with wide-scale automation of low-value tasks. There is an extremely cool BMW video around this, with not a single MS logo in sight, but some ultra-cool hardware (desks and walls that are monitors with optimal transparency) that makes "Minority Report" look terribly crude.

Of course nobody can deliver on this today, but there is a lot of investment going on at MS and elsewhere. I would love to work with that stuff today. And I suddenly see the value of "glass" in Vista, but it has a long way to go.

Would be cool if some OSS software got there first!

HW may slow its pace... (2, Funny)

Jennifer York (1021509) | more than 7 years ago | (#17000696)

but I see an SOA (Service-Oriented Architecture) solution to the problem. These building blocks will be used to scale the productivity of the developer. As more and more services like Flickr, Google Maps, and the like continue to provide key services to developers, our mashups will become more compelling. Just remember to make your mashup a service so that someone may build upon it when you are done.
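In that spirit, a minimal sketch of "make your mashup a service": a toy HTTP endpoint that pulls from two upstream services and re-exposes the combined result. The upstream URLs are invented placeholders, not real APIs.

    # Toy mashup-as-a-service: fetch two (hypothetical) upstream feeds,
    # combine them, and serve the result so others can build on it.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    UPSTREAMS = [
        "http://photos.example.com/api/recent",   # placeholder service
        "http://maps.example.com/api/geocode",    # placeholder service
    ]

    class MashupHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            combined = {}
            for url in UPSTREAMS:
                with urlopen(url) as resp:         # call each upstream
                    combined[url] = json.load(resp)
            body = json.dumps(combined).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)                 # our own service layer

    HTTPServer(("", 8080), MashupHandler).serve_forever()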

Variance amongst workers (1)

tezza (539307) | more than 7 years ago | (#17000714)

Joel Spolsky [joelonsoftware.com] makes some good arguments about the best programmers being significantly more productive than the rest.

In France, the government found that some surgeons were able to achieve 12x as many procedures as others at the same quality level[1]. This is the basis of the NHS reforms in the UK [bbc.co.uk], i.e. provide a system that encourages the 12x surgeons and other staff to succeed.

So one way to increase productivity is to identify those 12x people, and find less demanding work for the <12ers. This is done by hocus-pocus Management Consultant type processes at the moment. Correcting this would lead to improved efficiency overall.

How much improvement? Dunno, I'm not a 12x person either!

----
[1] I read this in The Economist [economist.com]. Sorry, they require registration+£££ to read the article.

Depends on your profession (3, Interesting)

Pedrito (94783) | more than 7 years ago | (#17000790)

Sure, for most people, productivity isn't going to increase 10-fold. Hell, as a software engineer, I can't imagine getting 10 times as much stuff done in the same period of time anytime soon. Faster computers won't help, and about the only thing that would speed up my productivity as a programmer is software that would write the code for me, putting me out of a job.

There are a lot of people working in the sciences who think differently, though. Chemists, biologists, and physicists could all do well with not just smarter programs but faster computers. As a couple of simple examples: molecular mechanics modeling for chemists and protein folding modeling for biologists (particularly the latter, and the two are related) are insanely computationally intensive, and if computers were able to provide the results in 1/10th or 1/100th of the time, it would make a big difference in their ability to get things done. So I think it kind of depends what you do. I mean, let's face it, if you're a secretary, a faster word processor isn't going to make you 10 times more productive. Maybe a faster copier would help...
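To see why raw speed still matters there: the heart of a molecular mechanics step is an all-pairs interaction sum, which grows quadratically with the number of atoms. A toy Python sketch with a made-up potential (not a real force field), purely to show the scaling:

    # Toy all-pairs energy loop: the O(N^2) kernel that dominates
    # molecular mechanics. Real codes use cutoffs and cleverer
    # algorithms, but the scaling is why 10x-100x faster machines
    # matter to these fields.
    import random

    atoms = [(random.random(), random.random(), random.random())
             for _ in range(1000)]

    def pair_energy(a, b):
        # Invented inverse-square potential, standing in for the real thing.
        d2 = sum((x - y) ** 2 for x, y in zip(a, b)) + 1e-12
        return 1.0 / d2

    total = 0.0
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):   # ~N^2/2 pairs: 499,500 here
            total += pair_energy(atoms[i], atoms[j])
    print(total)

Double the atom count and the work roughly quadruples, which is why these groups queue for supercomputer time.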

Re:Depends on your profession (1)

bernywork (57298) | more than 7 years ago | (#17000970)

You spend 100% of your time coding?

You don't spend any time searching for solutions to problems, or dealing with customers?

Not knowing your response: do you think it might be possible to save some time on the above?

More important is that productivity has bought (0)

Anonymous Coward | more than 7 years ago | (#17000830)

us free time. What we do with the free time will determine our future.

Chunks / Levels of abstraction (1)

SpinyNorman (33776) | more than 7 years ago | (#17000836)

There are human limits on things like how many items we can simultaneously hold in short-term memory (~7) or how fast our brains work, but that doesn't equate to a limit on "productivity". The key here is chunking and levels of abstraction: we can overcome the limit on how many items we can manipulate by "chunking" simpler items into groups that we then consider as a whole (e.g. memorizing a phone number as 3 chunks vs 10 digits), and we can gain power in our thinking by working at a higher level, in terms of more powerful concepts composed of simpler ones. For example, it'd be a lot more productive to design a new spaceship if you can operate at the level of "attach a type X propulsion unit to a type Y living unit" rather than doing low-level design, but achieving this sort of productivity gain requires a lot more intelligence in how components are designed for this sort of higher-level usage.
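The same move shows up in software design. A minimal sketch, with invented classes, of what composing at the level of whole units looks like:

    # Chunking in code: the designer composes a few high-level units
    # instead of juggling thousands of low-level parts. All names here
    # are invented for illustration.
    class PropulsionUnit:
        def __init__(self, kind):
            self.kind = kind        # hides tanks, pumps, nozzles, wiring

    class LivingUnit:
        def __init__(self, crew):
            self.crew = crew        # hides life support, bunks, galley

    class Spaceship:
        def __init__(self, propulsion, habitat):
            # Two chunks to reason about, not ten thousand parts.
            self.propulsion = propulsion
            self.habitat = habitat

    ship = Spaceship(PropulsionUnit("type X"), LivingUnit(crew=6))

The gain only materializes if the units were designed to compose, which is the "more intelligence in how components are designed" part.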

It's gotta do with the hardware (1)

cpux (970708) | more than 7 years ago | (#17001012)

As far as traditional solid-state electronics go, we are reaching a bit of a threshold in what we can do. Current tech uses 65 nm transistors. The next generation will use 45 nm transistors, which is at the point where circuits become very vulnerable to faults from cosmic rays causing noise. As a result, hardware design is focusing a lot more on fault-tolerant designs, which are slower.
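The parent doesn't name a specific technique, but one classic fault-tolerance pattern is triple modular redundancy: do the work three times and majority-vote the results, trading speed for resilience. A toy Python sketch of the idea:

    # Triple modular redundancy (TMR): run a computation three times and
    # take the majority answer. A single transient fault in one copy is
    # masked, at roughly 3x the cost, which is why fault tolerance and
    # raw speed pull in opposite directions.
    from collections import Counter

    def tmr(compute, *args):
        results = [compute(*args) for _ in range(3)]   # redundant runs
        value, votes = Counter(results).most_common(1)[0]
        if votes < 2:
            raise RuntimeError("no majority: multiple faults")
        return value

    print(tmr(lambda x: x * x, 12))   # 144, even if one run glitches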

Define 'productive' for a CTO (1)

sugarman (33437) | more than 7 years ago | (#17001214)

We're at the point where pretty much everyone is familiar with e-mail and a web browser now. We're a long way from the end of productivity. When the regular office staff are able to run a query on the fly to get the data they need, rather than manually cutting and pasting from some predesigned job because the legacy systems don't interact, we might get closer.

There is still a ton of manual busy work that could be automated or sped up in most corporations. There's a lot more that could be done.
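As a minimal illustration of the "query on the fly" point, using Python's built-in sqlite3 with an invented orders table:

    # The busywork being replaced: pasting rows between spreadsheets.
    # The replacement: an ad-hoc query anyone on the staff could run.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [("north", 120.0), ("south", 75.5), ("north", 42.0)])

    # One line answers a question that would otherwise cost an afternoon
    # of cut-and-paste from systems that don't interact.
    for region, total in db.execute(
            "SELECT region, SUM(amount) FROM orders GROUP BY region"):
        print(region, total)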

I've been in the business twenty five years (2, Interesting)

hey! (33014) | more than 7 years ago | (#17001250)

I've never seen or heard of anything like a blanket ten-fold increase in productivity come from the introduction of a new system or even a new technology. Perhaps certain tasks were sped up 10x, but the volume of revenue generation does not increase 10x. Of course there are cost reductions from staff reduction, but for some reason it seems rare to see large-scale downsizing as a result of introducing IT (as opposed to new manufacturing technologies or new business practices).

Mostly we are talking about marginal improvements, although these are often not to be sneezed at. Margins are where competition takes place; they're where the difference between profitability and unprofitability, or positive and negative cash flow, is determined. For things that are done on massive scales, marginal improvements add up. But even doubling actual productivity?

What IT mainly does is shift expectations. When I started work in the early 80s, business letters and memos were typed. Now we expect laser printed output or emails. A laser printed letter doesn't have 10x the business impact of a typed letter. An email gets there 1000x faster than an express package, but it seldom has 1000x the business impact when looked at from the standpoint of the economy as a whole. You only have to use email because your competition is using it as well, and you can't afford a competitive differential in speed.

Many changes created by information technology are imponderable. For example, one great difference between the early 80s and today is that there are far fewer secretarial staff. Memos and letters used to be typed by specialists who were often responsible for filing as well. Now these tasks are mostly done by the author, arguably eliminating a staff position. On the other hand, the author spends much more time dealing with computer and network problems; not only is his time more expensive than secretarial time on a unit basis, he also needs the support of highly paid technical staff.

Some technology-mediated changes are arguably negative: we used to set deadlines for projects based on the delivery time plus a margin for delivery. Now it's common for proposals and reports to be worked on up to the last possible minute.

There are, no doubt, many substantial business savings created by new practices enabled by technology. Outsourcing, specialization, accurate and precise cost of sales, these are things that over time may have a huge impact.
