
Michio Kaku's Dark Prediction For the End of Moore's Law

timothy posted more than 3 years ago | from the techno-malthusian dept.

Hardware

nightcats writes "An excerpt from Michio Kaku's new book appears at salon.com, in which he sees a dark economic future within the next 20 years as Moore's law is brought to an end when single-atom transistors give way to quantum states. Kaku predicts: 'Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.'" Exactly the way the collapse of the vacuum tube industry killed the economy, I hope.


Good Morning (-1)

Anonymous Coward | more than 3 years ago | (#35549316)

It's been years since my last first post

Re:Good Morning (-1)

Anonymous Coward | more than 3 years ago | (#35549324)

It's been minutes since I fucked your mother. What a slut!

Re:Good Morning (3, Funny)

Anonymous Coward | more than 3 years ago | (#35549950)

DAD?

Re:Good Morning (-1, Redundant)

De_Boswachter (905895) | more than 3 years ago | (#35549350)

It's been years since I read Slashdot comments with (Score: 0). Maybe I'll read another in ten years.

Re:Good Morning (1)

Arivia (783328) | more than 3 years ago | (#35549564)

A really good tip an AC gave me ages ago: change troll and flamebait mods to +5 and browse at -1. A lot of good comments get modded down just because people dislike them. 0 isn't the end of all quality, you know. Also: holy crap, Slashdot's at over 2,400,000 accounts? I suddenly feel ancient, and I'm only 22!

No planetary alignment? (5, Funny)

andreicio (1209692) | more than 3 years ago | (#35549336)

Noone will take a disaster prophecy seriously if you can't even be bothered to pair it with some planetary alignment or ancient calendar.

Re:No planetary alignment? (1)

rossdee (243626) | more than 3 years ago | (#35549506)

Or Nostradamus

Re:No planetary alignment? (0)

Anonymous Coward | more than 3 years ago | (#35549756)

I would not think that Peter Blair Denis Bernard Noone (http://encyclopediadramatica.com/Noone) is that simple minded :(

Re:No planetary alignment? (2)

msauve (701917) | more than 3 years ago | (#35549912)

"Noone will take a disaster prophecy seriously..."

What do Herman's Hermits [peternoone.com] have to do with silicon technology disasters?

On vacuum tubes. (2)

rnws (554280) | more than 3 years ago | (#35549338)

The major difference being the tube/valve industry was done in by the transistor - i.e. we had a viable replacement that was better. The problem with the transistor is that we don't (yet) have a viable replacement.

Re:On vacuum tubes. (5, Insightful)

frnic (98517) | more than 3 years ago | (#35549412)

Before we had transistors we didn't have them yet either.

Re:On vacuum tubes. (1)

WrongSizeGlass (838941) | more than 3 years ago | (#35549440)

The major difference being the tube/valve industry was done in by the transistor - i.e. we had a viable replacement that was better. The problem with the transistor is that we don't (yet) have a viable replacement.

There's a big difference between then and now. We have a lot of people/companies/countries trying to drive the progress and development of new technologies. Small startups can play a role, or even become the new leaders; ungodly international conglomerates can 'change or die'. There's a brave new world out there ... but then, there's always a brave new world out there. Now you young folks go get it and bring it back to those of us who are tired, cranky and complacent.

Re:On vacuum tubes. (5, Insightful)

maxwell demon (590494) | more than 3 years ago | (#35549454)

So what? Already today the chips are just perfect for most applications. Add 20 more years of Moore's law, and we won't even need more powerful chips. You'll have the power of today's supercomputers on your cell phone. I doubt Moore's law would continue even if physically possible, because there will be no need for it.

Re:On vacuum tubes. (1)

frnic (98517) | more than 3 years ago | (#35549474)

Enough is never enough, but we will find a way to continue to advance.

Re:On vacuum tubes. (0)

Anonymous Coward | more than 3 years ago | (#35549534)

Bring on the bloat!

Re:On vacuum tubes. (1)

wisty (1335733) | more than 3 years ago | (#35550118)

Otherwise, we'll never be able to play Crysis 16 on Windows 2030.

Re:On vacuum tubes. (0)

Anonymous Coward | more than 3 years ago | (#35549488)

Add 20 more years of Moore's law, and we won't even need more powerful chips.

We won't need chips at all! The new computers will be atom-sized, and gosh, everything already has atoms in it!

Re:On vacuum tubes. (1)

Covalent (1001277) | more than 3 years ago | (#35549520)

+5, FTW. How much smarter does the refrigerator need to be? Is the "perfect" refrigerator possible within the limits Kaku proposes? I have to think the answer is yes. Therefore, for most applications, Moore's Law is irrelevant. This might cause problems for supercomputing, but for most applications it's a non-issue.

Re:On vacuum tubes. (2)

jbolden (176878) | more than 3 years ago | (#35550092)

Right now it's hard to get refrigerators that maintain the proper temperature at different points. Imagine a system that can manipulate airflow based on what's inside it: i.e. one that runs a fluid dynamics program, takes pictures of its internal contents and analyzes them, offers an interface to your computer...

There is nothing like that on the market, and yes, it would be of huge economic value. Keeping food at the right temperature lets people store better foods, which can lead to them buying more delicate foods that taste better but aren't sold today because they would be ruined by the cheap equipment in most houses.

Re:On vacuum tubes. (4, Insightful)

wierd_w (1375923) | more than 3 years ago | (#35550094)

Amusingly, that only confirms Kaku's prediction.

If your existing refrigerator is perfectly good, then what incentive do you have to buy the NEW refrigerator?
If you don't buy NEW refrigerators, how does the refrigerator manufacturer stay in business?

For a more geek-friendly variant of this, look at Microsoft. Their last three "new" versions have mostly been about Microsoft's bottom line, and less about true innovation. (E.g., look how hard they are trying to kill Windows XP.)

When you reach the point where your company can no longer just add bloat, call it new, and sell it like hotcakes, because the existing product is arguably just as good, if not better, due to physical limitations of the device, then profitability grinds to a halt and the industry suffers mightily.

What you would see instead is a service industry being created in place of a product industry.... Oh wait, we already do!

Re:On vacuum tubes. (2)

mwvdlee (775178) | more than 3 years ago | (#35549522)

Until even the most complex task imaginable can be computed in less time than it takes you to click a button, there will be a need for more processing power.

Re:On vacuum tubes. (2)

kdemetter (965669) | more than 3 years ago | (#35550108)

Yes, and this 'need for more processing power' is exactly what Moore's law exploits: Moore's law basically dictates that the demand for processing power doubles every year.

As a result, it's most profitable to follow this demand.

Speeding it up would be silly (even if new technology allowed it), because that means you lose money:

For example, if I suddenly were to create a processor with 10,000x the processing power, I would go bankrupt:

- Either it would be so expensive that no one would buy it, because no one needs that much processing power anyway.
- Or, if it were not that expensive, then everyone might buy it. But afterwards, it would be many, many years before anyone needed a more powerful processor, and I'm not making much money in the meantime.

Moore's Law ensures that every year people will find that their computer is too slow, and they will buy a new one, which in turn provides revenue for the manufacturers.
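
For a rough sense of that compounding, here is a toy sketch in Python (the 18-month doubling period is the commonly quoted Moore's-law figure; the 1x starting point is just an illustrative assumption):

    # Toy sketch of Moore's-law-style compounding: capability doubling every
    # 18 months. The doubling period and the 1x starting point are illustrative
    # assumptions, not figures taken from the thread.
    def growth(years, doubling_period_years=1.5):
        return 2 ** (years / doubling_period_years)

    for years in (5, 10, 20):
        print(f"after {years:2d} years: ~{growth(years):,.0f}x")
    # after  5 years: ~10x
    # after 10 years: ~102x
    # after 20 years: ~10,321x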

Re:On vacuum tubes. (3, Funny)

Anne Thwacks (531696) | more than 3 years ago | (#35550186)

Moore's Law ensures that every year people will find that their computer is too slow

No - Microsoft does that. Moore's law ensures that new computers can perform better at the same rate that MS adds bloat to their software, or marginally faster. By avoiding the use of Windows, I can continue to use my 4 year old PC or ten year old Sparc machines. YMMV

Re:On vacuum tubes. (0)

Anonymous Coward | more than 3 years ago | (#35549546)

Today's supercomputers are about one million times more powerful, which takes 30 years of doubling every 18 months, not 20.
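
The arithmetic behind that figure, sketched out (the 18-month doubling period is the one used in the comment):

    # ~1,000,000x requires about 20 doublings (2^20 = 1,048,576); at one
    # doubling every 18 months that is 20 * 1.5 = 30 years, hence "30, not 20".
    import math

    doublings = math.log2(1_000_000)   # ~19.9
    years = doublings * 1.5            # ~29.9
    print(f"{doublings:.1f} doublings -> about {years:.0f} years")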

Re:On vacuum tubes. (0)

Anonymous Coward | more than 3 years ago | (#35549656)

Well, if there continues to be a demand for ever more accurate simulations and more powerful AI, then you want as much processing power as possible.

Keep in mind the combinatorial explosion factor, which means there are a lot of things even today's supercomputers cannot calculate. Such as every possible chess game with 40 moves or fewer. And that's just a board game.
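
For scale, a back-of-the-envelope version of that chess estimate (essentially Shannon's classic figure; the ~30 legal moves per ply is an assumed average branching factor):

    # Rough game-tree size for "every possible chess game with 40 moves or fewer".
    # 40 moves per side = 80 plies; ~30 legal moves per ply is an assumed average.
    import math

    plies = 40 * 2
    branching = 30
    print(f"~10^{plies * math.log10(branching):.0f} possible games")  # roughly 10^118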

Re:On vacuum tubes. (0)

Anonymous Coward | more than 3 years ago | (#35549768)

One of these days a true successor to Java will come along, and you will eat those words :P

Re:On vacuum tubes. (5, Interesting)

jbolden (176878) | more than 3 years ago | (#35550072)

Today's chips were perfect for most applications in the 1980s. Once WordPerfect could outrun a human in terms of spell check and could outrun even the fastest printers, CPU upgrades didn't do much. Same with Lotus 1-2-3, once complex spreadsheets with lots of macros could be processed faster than a human could read a spreadsheet....

But all that excess power led to the GUI, and then to technologies like OLE, which drove up requirements by orders of magnitude. But OLE hasn't really hit another generation because everything is so unstable. Imagine the next generation of applications that have data embedded from dozens of devices and hundreds of websites. Say I do a Quicken report which:

a) contacts my banks' internet connections and pulls in all the credit card transactions
b) hits each of those vendors (100+) with the credit processing number and pulls up all the items for each transaction
c) does an item lookup to figure out what sort of expenses they are and prorates out general costs, like sales tax. That's thousands of web information requests for an annual report. (A toy sketch of this last step appears after this comment.)

That sort of data processing we don't yet have, and certainly not on cellphones. Another area where systems are underpowered is AI.
Imagine a news search engine that knows my entire browsing history -- like a Pandora across all my news choices for the last year. I search for a story, and because the system knows my preferences on dozens of dimensions it's able to feed me the stories that best fit them. Analyzing every article every day to do simple word counts is about the limit of a massive Google datacenter. Analyzing every article every day to determine how much scientific background it assumes in biology, in chemistry, in mathematics; what sort of editorial biases it has; how human-interest heavy the presentation is; how respected the journal is... that's way beyond what we can do today.
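
A hypothetical sketch of step (c) of the report pipeline described above; all data and helper names here are invented for illustration, and nothing is a real Quicken, bank, or vendor API:

    # Hypothetical report pipeline: categorize line items and prorate general
    # costs (like sales tax) across them. All data below is made up.
    from collections import defaultdict

    transactions = [  # stand-ins for the web lookups in steps (a) and (b)
        {"sales_tax": 4.00, "items": [("food", 30.00), ("household", 20.00)]},
        {"sales_tax": 1.40, "items": [("books", 20.00)]},
    ]

    def annual_report(txns):
        totals = defaultdict(float)
        for txn in txns:
            subtotal = sum(price for _, price in txn["items"])
            for category, price in txn["items"]:
                share = price / subtotal if subtotal else 0.0
                totals[category] += price + share * txn["sales_tax"]
        return dict(totals)

    print(annual_report(transactions))
    # food: ~32.4, household: ~21.6, books: ~21.4 (tax prorated by item share)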

Stupid comment (0)

Anonymous Coward | more than 3 years ago | (#35549344)

Take one paragraph entirely out of context, add a sarcastic line, and miss the point entirely. Timothy didn't even bother to RTFA, perhaps? Kaku has a pretty high degree of credibility; he is an excellent writer. Shame on you; go and read what the man actually says.

Re:Stupid comment (0)

Anonymous Coward | more than 3 years ago | (#35549438)

The man says, and I quote, "I never saw a spotlight I didn't love."

Re:Stupid comment (0)

Anonymous Coward | more than 3 years ago | (#35549608)

The man says, and I quote, "I never saw a spotlight I didn't love."

no that's Jesse Jackson and Al Sharpton. the "reverends".

Re:Stupid comment (2, Insightful)

Anonymous Coward | more than 3 years ago | (#35549958)

Uh, no. He's a gawdawful writer. The entire excerpt was a dreary and largely useless lead-in to the final paragraph. Kaku writes not as if he believes in using two words where one will do, but in using a hundred words where one will do.

And what does the reader get when you slog your way through to the last paragraph? The shocking news that quantum effects will put an end to conventional integrated circuits.

Jiminy Cricket! I wish I was smart enough to make that prediction! It's only been common knowledge in the tech community for a couple of decades. Maybe there's a Nobel Prize for belaboring the obvious that Kaku's going for.

The implication of the article, which Kaku's smart enough not to get too explicit about, is that when that sad day arrives AMD and Intel - they'll still be the only two microprocessor manufacturers of any note - will produce their final chips, none of which will work. Oh, the tragedy! Oh, the humanity! Oh, if only they'd listened to Michio Kaku while there was still time!

Of course long before then Kaku will have cashed the checks from this piece of drek.

All the phony Luddites who moan about the arrogance of technophiles will have had their conceits confirmed: that technology is the crystallization of hubris. That's probably what they're tweeting each other right now on their iPhone 2s.

Meanwhile, back in the real world, Kaku's dark prognostications will be forgotten in less time than it takes AMD and Intel to produce the next generation of microprocessors.

This is typical!! (1)

crank-a-doodle (1973286) | more than 3 years ago | (#35549346)

Every time a new technology comes along, there is always this lingering fear that it might kill the economy, but as it turns out, every time new jobs are created and people's standard of living rises! So chillax!

Re:This is typical!! (0)

ZankerH (1401751) | more than 3 years ago | (#35549376)

Axcept this time there is no new technology (as of yet), just the old one running into physical limitations. If we can find something better in the next 20 years, good. Otherwise, the only way to make faster computers from that point on is to make them bigger.

Re:This is typical!! (1)

crank-a-doodle (1973286) | more than 3 years ago | (#35549408)

By new tech, I was referring to quantum computing...

Re:This is typical!! (0)

Anonymous Coward | more than 3 years ago | (#35549712)

"Axcept"? Go kill yourself, now.

Re:This is typical!! (1)

Patch86 (1465427) | more than 3 years ago | (#35549900)

The last time I replaced both my desktop and my laptop was when both of them were knackered from punishing overuse (9 years and 5 years respectively). The same goes for my last phone, and my last TV. If the only replacement computers available were essentially the same as my old ones, I'd still have replaced them, the same as I would replace a broken oven with a new (but not substantially improved) oven. Incidentally, the biggest difference between my new CPUs and the old CPUs was the number of cores - something Moore's Law doesn't specifically deal with - and the computers of the future might be 32-core monstrosities, with 8 bits of silicon wired up with optical connectors.

The industry won't collapse, although it might have to slow down as the replacement cycle eases up. And there are lots of other components (internal connectors and whatnot) that can be incrementally improved for the computerphiles. Alas, computerphiles might even end up down the same dark route as audiophiles, with their magic-coated copper wiring.

Dark predictions (4, Insightful)

Wowsers (1151731) | more than 3 years ago | (#35549348)

I predict a dark future for Michio Kaku's new book.... namely, the bargain bin.

Re:Dark predictions (0)

JamesP (688957) | more than 3 years ago | (#35549540)

Funny

I got "Bringing Down the House" from the bargain bin once. The author wrote a book after that one, called "The Accidental billionaires"

So, having your book in the bargain bin is not that bad =P

Re:Dark predictions (2, Informative)

Anonymous Coward | more than 3 years ago | (#35549682)

Yeah, it's embarrassing when someone who's brilliant within his area of expertise starts nosing into other fields (in this case economics and the electronics industry) just to say stupid things. By the way, he did this before, although the previous victim was biology. Why do physicists think they are masters of all sciences? [scienceblogs.com] Granted, that was in response to a question, but he really should have said ‘I have no clue’. Why oh why do experts always think they're experts in everything?

and this is a bad thing? (5, Funny)

Hazel Bergeron (2015538) | more than 3 years ago | (#35549354)

Software developers are going to have to consider increasing efficiency as they make their wares more complex! And we might have to actually implement concurrency research which is under two decades old!

Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.

Re:and this is a bad thing? (1)

Anonymous Coward | more than 3 years ago | (#35549378)

You mean we're supposed to do something other than cram a thousand different abstraction layers into an application (e.g., JBoss Seam projects)?

Re:and this is a bad thing? (0)

Anonymous Coward | more than 3 years ago | (#35549862)

The abstraction layers serve a purpose. They exist to speed up development, which is still the limiting factor in computer applications. Anyway, the abstraction layers can exist because there is enough processing power. In most cases leaving out the abstractions would not mean we could use leaner computers: The processing power is needed for the fundamentally hard tasks. When you want to compress full HD video, you're going to either need dedicated hardware or a very fast processor. There's no point letting it go unused most of the time because you have super fast immediate software that took ages to develop. With this, you can also see why a brick wall limit for processing power is problematic: The size of fundamentally hard tasks can not grow without increases in processing power. Luckily Moore's law isn't about processing power: Most of these tasks are highly parallel, so shrinking transistors isn't the only way for scaling up processing power (and hasn't been for quite a while already).

Re:and this is a bad thing? (1)

jsprenkle (2009592) | more than 3 years ago | (#35550022)

Wow, there are two of us who figured out that yet another abstraction layer isn't always a good thing.

Re:and this is a bad thing? (1)

newcastlejon (1483695) | more than 3 years ago | (#35549452)

Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.

Ah, my trusty old friend Acorn... what went wrong?

Re:and this is a bad thing? (0)

Anonymous Coward | more than 3 years ago | (#35549838)

Ah, my trusty old friend Acorn... what went wrong?

It may have been when they tried to sell a canary yellow computer called Phoebe.

Maybe IT will stop sucking up 10% of economy (-1, Troll)

shoppa (464619) | more than 3 years ago | (#35549356)

Right now circa 10% of GDP is being wasted on IT.
See e.g. IT expenditures as a fraction of GDP [nationmaster.com].
Imagine if someone else came up with a "new refrigerator" and the effort of maintaining the "new refrigerator" came to suck up 10% of the economy.
There would be a "slashdot for refrigerators" and a bunch of nerds insisting that they are actually helping the economy by sucking up 10% of it.

Re:Maybe IT will stop sucking up 10% of economy (4, Insightful)

sydneyfong (410107) | more than 3 years ago | (#35549410)

Yeah, maybe we should stop the waste, and employ human operators to send telegraphs like they did in the good old days, scribes to write documents by hand....

Re:Maybe IT will stop sucking up 10% of economy (0)

Anonymous Coward | more than 3 years ago | (#35549600)

I don't think that was his point either, although "waste" is a poor choice of word. Consider this: back in the day, nearly everyone was a farmer. Farming was, and still is, a requirement for society to function, but in modern developed countries we don't need to allocate all our resources to agriculture to keep it up. If we can sustain IT in the future while keeping its consumption of human resources to a minimum, it's a win-win situation for everyone. Disregard GDP and all the other fancy abstractions; humans are, after all, society's only resource, and as such a highly precious one.

Re:Maybe IT will stop sucking up 10% of economy (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35549424)

Really, IT has far more wide-ranging applications than a fridge and can create new ways of doing things; these may not always be better, but a good proportion of them are. People who think that IT is a waste are usually the same people who think the space program is a waste, or that education is a waste. Progress has to come from somewhere; it is not magically pooped from the butts of celebrities or political figures as they dance about appealing to the masses.

Um, refrigerators use a lot of energy. (1)

Colin Smith (2679) | more than 3 years ago | (#35549426)

I take it that you are too young to pay the electricity bill... Basement? Cooler down there?
 

Re:Maybe IT will stop sucking up 10% of economy (2)

ZankerH (1401751) | more than 3 years ago | (#35549464)

A "new refrigerator" is, supposedly, more efficient than the last one. The emergence of IT made entire armies of secretaries, messengers, archive managers, human computers etc obsolete, changing society profoundly. The comparison to an iterative development of an existing technology strikes me as moot.

Re:Maybe IT will stop sucking up 10% of economy (2)

WrongSizeGlass (838941) | more than 3 years ago | (#35549492)

Imagine if someone else came up with a "new refrigerator" and the efforts on maintaining the "new refrigerator" came to suck up 10% of the economy.

How big an LCD will this fridge have? Will it have USB 3, Thunderbolt or Gigabit Ethernet? How about WiFi, a full Bluetooth implementation or this newfangled NFC stuff? Will my better half be able to hook up a scale that weighs me not only before I open the fridge but after, to see exactly what I took out? Will a pre-recorded movie play that tells me I shouldn't be eating whatever I just took out, reminding me of my diet, or just asking "are you going to bring me one, too?" What about commercials? "I see you're running low on Pepsi 3000! You should go buy some more Pepsi 3000! Now!! But wait, it's a long trip to the store and you may get thirsty - why not have a Pepsi 3000?" Will I then hear her voice telling me "put that back! Have some fruit instead!"?

Re:Maybe IT will stop sucking up 10% of economy (2)

TheVelvetFlamebait (986083) | more than 3 years ago | (#35549606)

Wait, does that mean I've been wasting the 20-30% of my budget that I spend on food? I sure am going to miss it. Oh well, at least my pastime of throwing dollar coins at drains only costs me about 2% of my income and is therefore not wasteful.

Seriously (1)

Epeeist (2682) | more than 3 years ago | (#35549364)

Does anyone pay any attention to Michio Kaku? He isn't quite as much of a woo merchant as Deepak Chopra; perhaps one could compare him to the likes of Henry Stapp or Fritjof Capra.

Oh really? (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35549384)

Apparently people can't:
make cluster computers
make boards with multiple chip sockets
make extension boards that go in PCI or potential future slots
use distributed computing
[insert many other ways to get around limited processing power]

Man, we sure are screwed in 20 years time, computers will start rebelling against us because we can't make them smaller than the boundaries of the universe!

On a more serious note, this is retarded. Period.
20 years is a VERY long time.
By then, we'd probably actually have the beginnings of working quantum computers that are useful.
By then, we'd have almost certainly found out how to get around or deal with these problems, possibly even taking advantage of quantum effects to reduce circuit complexity and power needs.
Who knows, but I know one thing for sure: the world won't end, life will go on as usual, and this book will still be shit.

This is a perfect example of the world today (4, Insightful)

gearloos (816828) | more than 3 years ago | (#35549400)

Michio Kaku is not necessarily the best in his field, mediocre at best, but he has the biggest voice. I was talking to an older woman a while back who is a devoted fan of his. I asked her what she knew of him other than that he does "layman's" breakdown commentaries on physics for the Discovery Channel, and she actually thought badly of me for trying to undermine her opinion of "the top physicist in the world today". Well, that's definitely HER opinion and not mine. The point I'm trying to make here is that just because he has a big mouth (media-wise) does not make him remotely right about anything. Oh, I just got it: now I understand politics, lol.

Re:This is a perfect example of the world today (0)

Anonymous Coward | more than 3 years ago | (#35549586)

What's wrong with science popularizers? Isn't it a good thing that some scientists actually bother to try and educate the public?

Re:This is a perfect example of the world today (2)

FrootLoops (1817694) | more than 3 years ago | (#35549588)

I'd bet most of the top people in their field don't take the time to make their field publicly accessible. Stephen Hawking comes to mind as a counterexample with a few books, but I can't think of a single mathematician counterexample. My point is Michio Kaku doesn't have to be a "top physicist", and I wouldn't even expect him to be. That he popularizes technical stuff is enough for me.

He probably has a good point, too, that at least eventually Moore's law failing will have strong economic impacts, and it's unlikely that an exponential law can continue indefinitely.

Re:This is a perfect example of the world today (3, Interesting)

Fnkmaster (89084) | more than 3 years ago | (#35549882)

Hawking isn't even a top physicist. I mean, he's a serious, good physicist, and an inspiring guy, just not one of the 5-10 best physicists alive today. Kaku on the other hand is just a popularizer. Which is fine. Except that the guy seems to be a hack and huge self promoter.

Gradual transition (4, Insightful)

Kjella (173770) | more than 3 years ago | (#35549404)

Sooner or later it will come to an end, but it will come slowly as the challenges rise, the costs skyrocket, and the benefits shrink due to higher leakage and lifetime issues. And designs will continue to be improved; if you're no longer constantly redesigning for a tick-tock every two years, you can add more dedicated circuits to do special things. Just as an example, look at the dedicated high-def video decoding/encoding/transcoding solutions that have popped up. In many ways it already has stopped, in that single-core performance hasn't improved much for a while; it's been all multicore and multithreading of software. Besides, there are plenty of other computer-ish inventions to pursue, like laying fiber networks everywhere, mobile devices, display technology - the world will still be changing significantly 20 years from now. Just perhaps not at the fundamental CPU core / GPU shader level.

Re:Gradual transition (1)

Sir_Sri (199544) | more than 3 years ago | (#35550048)

And there's a lot to be done with different architectures, and just plain organizing computers differently (basically changing up how the CPU, GPU and various memory systems connect to each other).

Right now all of that experimental stuff pretty much stays experimental, or custom, because by the time you get it out the door the traditional CPU-GPU market has gone through a tick-tock cycle and no matter how good your idea was, it's still not as economical as a newer version of your traditional hardware.

Once that progress slows, there will be time for more innovation in how the computer does its thing.

Oh, and memristors might change the whole show anyway. They might not, and the same could be said for quantum information processing. In either of those cases, even if you fairly rapidly reach a limit on die size, you still have a whole lot of time to completely redesign the hardware to use those parts "optimally", however one chooses to define that.

Re:Gradual transition (1)

jbolden (176878) | more than 3 years ago | (#35550154)

Excellent, agreed. I said a similar thing: http://hardware.slashdot.org/comments.pl?sid=2045520&cid=35549980 [slashdot.org]

The end of CPU/GPU density scaling is nowhere near the end of computers "getting better". Your points about all the display technologies and wiring are good ones. We still aren't using the CPU technology we have today in most devices.

His view (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35549420)

His view is based upon the chip and not on the device.

What I'm seeing is folks (manager types ) using their iPhone as their business computer - eliminating the laptop and even their desktop. They're on the move and what they need is a portable communications device that also has some other apps.

Spreadsheets? That's for the back office staff. The same goes for anything else that you still need a desktop/laptop for.

So what's my point? Desktops and laptops are becoming commodity back-office devices (like the typewriter in the past), demand has stabilized, and as far as business apps are concerned, there isn't any need for more power - bloatware aside.

To head off the "64K of RAM is all anyone really needs" comments, that was then, this is now. Back then, we were at the birth of the PC revolution. Now, we're in the mature commodity stage. Will we need more power in the future? Yes. But at Moore's law increases? Nope.

The future is efficiency, portability and communication.

PCs are inefficient for most uses; therefore, there won't be any "death" or "economic" destruction - just some "old" companies hitting the skids (Dell) or vastly changing their business if they can (HP).

Then its the time to start growing up the thing? (0)

Anonymous Coward | more than 3 years ago | (#35549470)

If we actually do need all this extra processing power, we can always build a 1x1x1 km big cube of atom sized transistors, and then time share it with the planet.

Just for example (0)

epayroll (2021644) | more than 3 years ago | (#35549472)

Just for example look at the dedicated high def video decoding/encoding/transcoding solutions that have popped up. In many ways it already has stopped in that single-core performance hasn't improved much for a while, it's been all multicore and multithreading of software. epayroll [jankovicllc.com]

Re:Just for example (0)

Anonymous Coward | more than 3 years ago | (#35549536)

Just for example look at the dedicated high def video decoding/encoding/transcoding solutions that have popped up. In many ways it already has stopped in that single-core performance hasn't improved much for a while, it's been all multicore and multithreading of software. epayroll [jankovicllc.com]

nice spam, there.

Atoms to hadrons and quarks (0)

Anonymous Coward | more than 3 years ago | (#35549480)

Atoms are composed of protons and neutrons (hadrons), which in turn are composed of quarks. So we can use quark-based chips and Moore's law can go on.

Amplified Future Shock (1)

b4upoo (166390) | more than 3 years ago | (#35549490)

There has always been a suffering factor built into changes in technology, spilling over and causing changes in society. Usually the suffering has been rather confined: the buggy whip makers destroyed by the new automobile market were not such a large group of workers. But now things are different and less predictable. A great example is the office staff eliminated by the cell phone. As cell phones took over, small companies were able to get rid of millions of girl-Friday types who had answered the phones and greeted walk-in customers in the past. We are now displacing workers so quickly that economic chaos is descending upon us. After all, earners pay taxes whereas the unemployed consume tax dollars.
Back at home, the US has no clue what to do with labor. Just as we saw people suffer loss of life and health trying to save victims of 9/11, now we see brave, skilled laborers in Japan entering death zones in order to save the public. I am certain that all that really matters to the Republicans is that those workers do not have a union nor want help while they die from radiation sickness, just as they will not take care of firemen and police with busted lungs and cancers from the WTC rescue efforts. Frankly, if we suffer another attack, perhaps our public employees will smarten up and stay far back from such catastrophic events.

Re:Amplified Future Shock (1)

FrootLoops (1817694) | more than 3 years ago | (#35549550)

This is OT, but did your buggy whip reference come from the movie Other People's Money?

Re:Amplified Future Shock (0)

Anonymous Coward | more than 3 years ago | (#35549888)

Very good points. You left out the part where Republicans and their sympathizers don't actually mind catastrophes and disasters either, as long as nobody can convince the public that [insert crisis here] was something they should have seen coming and/or should have done something to prepare for. Witness their hysterical attack-the-messenger campaign against environmentalists, or against anyone who believes (or wants to further research) the notion that climate change might have anything at all to do with burning a lot of crap we dug out of the ground, destroying forests and polluting the oceans all at once. Now, it might not, but there's a lot of research suggesting there might at least be a connection, and the answer from conservatives is to try to stop the research and attack the researchers. Look at this thread and see how many posts deal with the subject matter at hand vs. how many personal attacks against the messenger are posted instead.

I wish debates these days could actually separate facts from opinions, but most people seem incapable.

New beginnings (0)

Anonymous Coward | more than 3 years ago | (#35549494)

The end is never really the end. Just the beginning of the next thing. What makes anyone think that the quantum level is the end of smallness?

I actually remember this from my university days (0)

Anonymous Coward | more than 3 years ago | (#35549512)

Admittedly it sucked, but they did have a class in parallel programming, with some of the guys from the physics department helping out. They pointed out that coders were eventually going to have to do parallel programming and break computation into chunks that could be calculated simultaneously, because of a little thing that would stop any single processor from getting faster: the speed of light. (I.e., you can only move data around so fast, and when you have to move data around faster than light you have a big problem.) Admittedly this was over a decade ago and, come to think of it, most modern CPUs are multi-core, so it looks like they had a point.
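
A quick sketch of that speed-of-light constraint (the clock rates are illustrative; real signals in copper or silicon are slower than vacuum light speed):

    # How far a signal can travel in one clock cycle, at best.
    C = 299_792_458  # speed of light in vacuum, m/s

    for ghz in (1, 3, 10):
        cm_per_cycle = C / (ghz * 1e9) * 100
        print(f"{ghz:>2} GHz: ~{cm_per_cycle:.0f} cm per cycle")
    # at 3 GHz a signal can cross only ~10 cm per cycle, which is one reason
    # data locality and parallelism matter.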

Parallelism (1)

mijelh (1111411) | more than 3 years ago | (#35549530)

Even if Moore's law comes to an end, we can still improve the performance of systems via parallelism.

Re:Parallelism (2)

maxwell demon (590494) | more than 3 years ago | (#35549580)

Even if Moore's law comes to an end, we can still improve the performance of systems via parallelism.

And by returning to writing efficient software.

Re:Parallelism (1)

swilver (617741) | more than 3 years ago | (#35549612)

Only up to a point, because increasing performance through parallelism alone means adding higher power requirements. If you think 300 W is a lot of power now, it will be a heck of a lot worse after we've compensated for Moore's law for a couple of years by adding more parallelism.

Re:Parallelism (2)

mijelh (1111411) | more than 3 years ago | (#35550006)

Moore's law is only about the number of transistors, not about efficiency. There's *A LOT* of improvement to be made in that area, regardless of whether or not we continue with miniaturization. We've seen a few examples on Slashdot, such as probabilistic pruning and others I don't remember. 300 W now doesn't mean we'll need 300 W for the same thing tomorrow. On the contrary, just check the energy consumption of your cell phone today vs. a computationally equivalent computer 20 years ago.
Still, we have some limits, such as Amdahl's law (basically, you can only speed up via parallelism the segments of code that are... well, parallelizable).
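
For reference, a minimal sketch of the Amdahl's law limit just mentioned (the parallel fractions and core counts below are illustrative, not from the comment):

    # Amdahl's law: overall speedup is capped by the serial fraction of the work,
    # no matter how many cores are added.
    def speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for p in (0.50, 0.90, 0.99):
        print(f"p={p:.2f}: 16 cores -> {speedup(p, 16):4.1f}x, "
              f"1024 cores -> {speedup(p, 1024):5.1f}x")
    # even with 1024 cores, a 90%-parallel program tops out at about 10x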

Re:Parallelism (1)

jbolden (176878) | more than 3 years ago | (#35550210)

Sure, but the power used by computers has gone down as we went from, say, 8" drives to 5 1/4" to 3 1/2" to 2 1/2" drives. For laptops this is a problem. But let's say the average computer used as much electricity as the average room air conditioner - how much of a problem would that really be? Especially if (given the cost) the system were shared out to dumb terminals all through the house, so there was only one of them.

Not really new from him. (3, Interesting)

s-whs (959229) | more than 3 years ago | (#35549556)

He made similar economic predictions in the BBC Horizon episode "The dark secret of Hendrik Schoen" (2004).

That was the day I lost all respect for Kaku. His economic predictions are moronic (there will always be change, including abrupt changes in what creates wealth), and in that Horizon documentary his comments seemed ludicrously off track as well.

Software design will get important again (0)

Anonymous Coward | more than 3 years ago | (#35549568)

Once hardware stops growing exponentially, software design and compilers will be optimized for performance. Little attention was paid to that in recent years because hardware would solve the problem. Once hardware performance stops growing exponentially, software design and infrastructure will grow exponentially -- there is room there for at least another 20 years. Around that time quantum computing will change the world and give rise to yet another innovation cycle.

memristor-based analog computers (2)

mo (2873) | more than 3 years ago | (#35549590)

Even with transistors staying the same size, there are so many avenues to explore in processor design. Just off the top of my head: how about a memristor-based analog co-processor for tasks like facial detection or language/speech recognition? How about processors with asynchronous clocks, or clockless designs? Sure, they're harder to build, but once transistor sizes stop shrinking, you might as well spend the effort, because designs will have a much longer lifecycle.

Kaku is a hack (4, Insightful)

thasmudyan (460603) | more than 3 years ago | (#35549636)

This guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little. Maybe I'm alone in thinking this, but his TV series "Physics of the Impossible" was one big self-aggrandizing marketing gig. I barely made it through two episodes that essentially consisted of the professor rehashing old science fiction concepts and passing them off as his own inventions. Every episode ended with a big "presentation" in front of dozens of fawning admirers. Before the credits rolled, they made sure to cram in as many people as possible saying how great and ground-breaking his ideas were. It was disgusting.

Are there physical limits to Moore's law? Sure. We already knew that. Circuits can't keep getting smaller indefinitely, and we already ran into the limit on reasonable clock speeds several years ago. And despite this, the computer industry hasn't cataclysmically imploded.

Re:Kaku is a hack (1)

Anonymous Coward | more than 3 years ago | (#35550120)

Exactly, what a sellout. I'm surprised there aren't more voices criticizing his bollocks (you're the first aside from me, on the whole web, usenet, the data sphere). He usually only gets airtime on FoxNews, Coast To Coast AM, etc. and that's where he belongs.

You have got to be kidding me. (0)

Anonymous Coward | more than 3 years ago | (#35549700)

Nut case.

Kaku is wrong on this one (1)

Eugenia Loli (250395) | more than 3 years ago | (#35549708)

While some parts of technology might stop progressing as fast, other parts will start getting optimized to compensate for the part that has stalled. So if hardware stops getting faster, people will start optimizing software (which is currently extremely inefficient), until we get better HW/SW tech at some point in the future. There's a very nice comment by JPS on the Amazon page for the book; give it a read.

Re:Kaku is wrong on this one (1)

maxwell demon (590494) | more than 3 years ago | (#35549772)

Unfortunately I can't get to it from the German Amazon site (and I don't know how to get to the U.S. version of the site). Do you have a direct link to that comment?

Re:Kaku is wrong on this one (1)

tgd (2822) | more than 3 years ago | (#35550142)

On this one?

Other than techniques for self-promotion and publicity, what is he generally correct on? I mean, he's not Dr. Oz level of self-important whackjob, but that's not saying much.

Just silly (1)

Anonymous Coward | more than 3 years ago | (#35549816)

I fail to see how the inability to make ever-smaller computers forever will stop us from still needing them by the boatload.

I am a solid state quantum physicist (5, Interesting)

drolli (522659) | more than 3 years ago | (#35549818)

From weird analogies and a certain amount of misunderstanding, the excerpt draws strange conclusions.

a) Misunderstanding how frequency spacing relates to the required number of cycles: the correct reasoning is that if light is at 10^14 Hz and you restrict yourself to, say, a 10% relative-bandwidth circuit, then, if you can modulate fast enough, you can ideally cram through roughly 10^13 * log2(S/N) bits per second -- so probably around 10^14 bits/second. That is a lot. (A rough sketch of this estimate follows after this comment.)

b) Limits to Moore's Law: Moore's law is an economic law. I see no physical limit that can be reached technologically (in mass production) before 2020. There is a technological limit to what can be produced, but going into the third dimension and using new materials will provide opportunities to continue on the same course for a while. If you look at what physicists are currently working on, you realize that the end of silicon/metal/oxide technology will not be the end of Moore's law or of classical computing.

c) "On the atomic level I can't know where the electron is." As it happens, I work on quantum computation, and I really hate having to explain this: only if you arrange a specific situation can you not know where the electron is on the atomic scale. If the statement were as general as he makes it, it would be impossible to have different chemical configurations of the same stoichiometric mixture, or single-molecule electronic/magnetic configurations. The quantum tunnel coupling between states in single-molecule magnets can be designed, and I don't see a specific reason why it should be impossible to realize single-molecule devices in which tunneling does not play a role.

d) He does not understand FETs, as far as I can tell.

e) Contrary to his opinion, very thin 2DEGs exist, and I don't see a reason why, upon (finding and) choosing the right layers, the confinement can't be very steep in the third dimension (not infinite, but also not requiring more than 50 nm of thickness).

The funny thing is that he forgot what already is, and probably will remain (there *may* be ways out, like superconductors or ballistic transport, but don't bet on it), the real problem for all classical/room-temperature computers: heat. While designing smaller elements may be possible with the right physics/technology, reducing the capacitance of the lines (associated with an energy loss in the line resistance per switching event) will be difficult. Once we *really* stack in the third dimension, it will take a lot of clever computer scientists (and maybe mathematicians) to reduce the needed interconnects, since otherwise stacking in the third dimension won't give us anything besides memory capacity.

So to conclude: I believe that by 2050 the definition of Moore's law will be obsolete. But it will not break down because we are unable to make circuits smaller; rather, it may become too expensive to make them smaller, or powering and cooling the circuits may become impractical. We will probably see Moore's law replaced by an equivalent scaling law for power per switching event.
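
A rough sketch of the estimate in point (a), using the standard Shannon-Hartley capacity formula C = B * log2(1 + S/N); the 10% relative bandwidth comes from the comment, while the S/N value is an assumed number for illustration:

    # Channel capacity for a 10% relative-bandwidth channel at an optical
    # carrier of 10^14 Hz. The S/N of 1000 is an assumed value.
    import math

    carrier_hz = 1e14
    bandwidth_hz = 0.1 * carrier_hz          # ~10^13 Hz
    snr = 1000.0
    capacity = bandwidth_hz * math.log2(1 + snr)
    print(f"~{capacity:.1e} bits/second")    # on the order of 10^14 bits/second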

Re:I am a solid state quantum physicist (0)

Anonymous Coward | more than 3 years ago | (#35550030)

Sorry, I think I was pounding out an angry comment titled "Kaku is a blight on science" while you were posting this. I should have just made it a reply to this. The title of your post is fine, but it should be subtitled "And Kaku Isn't."

Re:I am a solid state quantum physicist (1)

drolli (522659) | more than 3 years ago | (#35550122)

I actually thought about a similar title, but I find it necessary to point out one's own profession - unlike the author of the article. That is because an electrical engineer or a chemist may reach other conclusions - and I would find those very interesting. On the other hand, I am really pissed off that everybody who has heard the words "uncertainty relation" believes they can state things like "everything small must be quantum and tunnel" whenever they need to invoke the argument that classical physics does not hold (without understanding *when* it does not hold).

Re:I am a solid state quantum physicist (1)

mbone (558574) | more than 3 years ago | (#35550132)

I am curious about your take on quantum computers. My impression is that, if they are ever actually made into an operational product, they are likely to have a profound impact in certain areas (watch out, public key encryption!), but are unlikely to be much use in sending emails or watching videos.

Pseudo-economist (3, Insightful)

Boona (1795684) | more than 3 years ago | (#35549826)

Another pseudo-economist out to tell us that an increase in productivity and a lowering of living costs will be a net loss for society. Michio Kaku, can you please take an Economics 101 class before writing a book about the economic impact of anything? The general population is already economically illiterate, and this only fuels the problem. Thanks.

Moore's law will not end in disaster (1)

NoAffiliation (1987958) | more than 3 years ago | (#35549848)

The transistor will be replaced by 3D self-organizing molecular circuits. Watch this TED talk by Ray Kurzweil from 2005, where he explains the whole paradigm-shift issue: http://www.ted.com/talks/ray_kurzweil_on_how_technology_will_transform_us.html [ted.com] "Inventor, entrepreneur and visionary Ray Kurzweil explains in abundant, grounded detail why, by the 2020s, we will have reverse-engineered the human brain and nanobots will be operating your consciousness." - That's right, nanobots will be operating your consciousness.

Michio Kaku -- Go Away!!! (0)

Anonymous Coward | more than 3 years ago | (#35549966)

Damn ambulance chaser. I love physics; I despise Michio Kaku. He gives physicists a bad name.

He would have done better if he had gotten his PhD in home economics.

Kaku is a blight on science (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35549974)

Kaku is an embarrassment. In the mid/late 90s he presented himself as a "nuclear physicist" to the major news outlets (he is no such thing - he's a field theorist) and jumped on the till-then fringe protest movement opposing the launch of the Cassini mission. The opposition was based on the idea that the nuclear batteries on the probe posed a danger in the event of a launch accident. Never mind that there had previously been launch accidents with the same battery type (military sats) and the ceramic fuel cores were simply recovered and _reused_ because they're practically indestructible. (The batteries are just warm bricks. Low-level radioactive decay keeps them cooking and thermoelectrics generate the juice. There are no controls to go wrong, no parts to break, nada. That's why they're used. The ceramic itself is terrifically tough.)

Anyway, Kaku saw the cameras and the bright lights, decided that he was a nuclear physicist, and started spouting all sorts of total nonsense to frighten the unwashed masses. He has a long history of pretending to know things. Google "Kaku evolution blather" for another example. I watched him give a seminar once while I was in grad school, and I spent the hour squirming in embarrassment for him and his self-aggrandizement.

Yes, I loathe the man. I'm a physicist, and he just perpetuates the image of people in my field as hubristic egoists. He needs to be shouted down and run out of the media. There are lots of really good popularizers out there (deGrasse Tyson, Greene, etc.) who, yes, need to establish a public presence to make a career, but who are also actually interested in facts and science and education, and who know their own limits.

So I just read the article (1)

jbolden (176878) | more than 3 years ago | (#35549980)

Just read the article, haven't read the comments yet.

Moore's law, as far as CPUs and GPUs go, has already slowed down considerably this entire decade. As for memory, the chips aren't that thin yet. What this means is what everyone has been saying for a long time: more cores, more RAM. More cores means applications need to be parallelizable. That's at least a one-time overhaul of most of the world's code base.

Let's assume hardware improvements in general slow down. This leads to a hardware situation closer to what we had in the 1980s:

a) Because hardware is stable, operating systems and applications can be written more efficiently to take greater advantage of it. That means refactoring high-level code into lower-level code to get speedups. The popularity of Java is basically based on rapidly changing underlying platforms; make platforms stable and we get a language revolution with much less hardware abstraction. Compilers will get faster as well.

b) Because hardware is stable, computers don't seem like as much of a disposable item. Getting a good quality system to keep for many years makes economic sense. So we get a one-time boost as people move back from $700 computers to $3000 computers, where they expect to get 10+ years out of their computer.

c) In the area of cell phones we could see the same thing. While cell phones are too breakable to ever become extremely expensive (though there are people who buy expensive cell phones now: https://store.vertu.com/en/ [vertu.com]), if the platform were to stabilize we could see much richer client applications. If you expect to be on the same cell phone (with just hardware replacements) for a decade, your willingness to buy expensive software goes up.

So let's say I don't agree with 2020, because around 2020 is when you'd start to see everyone upgrading, which of course leads to software with much higher system requirements, which drives more upgrades.... But maybe by 2040 we have a stagnant computer technology industry, if nothing interesting happens. I guess that could happen, but I don't think it's likely. However, even if it did, this creates another advantage. You now have stagnant hardware, stagnant operating systems, stagnant languages, stagnant applications: an environment where corporate computing and custom code become a great value. And that is a huge and ever-growing code base. So now we are out around 2060, where the industry is in maintenance mode. So the question is: between 2010 and 2060, do you think no one is going to come up with a really good idea?

Limits of growth in general (1)

TeknoHog (164938) | more than 3 years ago | (#35550096)

Moore's law is not so bad compared to the bigger picture of running out of resources. An economy that assumes exponential growth will be thrown into turmoil, and millions^Wbillions will lose their jobs.

Give me a break (1)

mbone (558574) | more than 3 years ago | (#35550102)

Here is a news flash. I have it on good authority that

- eventually Moore's law will fail, and

- the world will continue to roll through the void. Life will go on, and we will not burn our Mac Book Pro's for heat, nor turn our rack-mounted servers into crude dwellings.

Grind to a halt? (1)

therealkevinkretz (1585825) | more than 3 years ago | (#35550140)

Huh? Just because computers stop improving exponentially doesn't mean there won't be continued, and additional, uses for those we have. And there are lots of other avenues for innovation in storage, input, output, etc. that might not deliver the raw performance gains we've become accustomed to - but will still be innovative and drive research and sales.

Laws of the universe (1)

Zandamesh (1689334) | more than 3 years ago | (#35550190)

I'm not a genius, but since we don't know all the laws of the universe, how can we possibly make such a prediction? 20 years is a long time; a lot of things will be discovered. But we still won't have flying cars, though...