
A Look Back At Kurzweil's Predictions For 2009

kdawson posted more than 5 years ago | from the no-one-expects-the-mule dept.

Books 307

marciot writes "It's interesting to look back at Ray Kurzweil's predictions for 2009 from a decade ago. He was dead on in predicting the ubiquity of portable computers, wireless, the emergence of digital objects, and the rise of privacy concerns. He was a little optimistic in certain areas, predicting the demise of rotating storage and the ubiquity of digital paper a bit earlier than it appears it will actually happen. On the topic of human-computer speech interfaces, though, he seems to be way off." And of course Kurzweil missed 9/11 and the fallout from that. His predictions might have been nearer the mark absent the war on terror.


307 comments

Civil Liberties (4, Insightful)

Kinky Bass Junk (880011) | more than 5 years ago | (#26339391)

And of course Kurzweil missed 9/11 and the fallout from that. His predictions might have been nearer the mark absent the war on terror.

His prediction on civil liberties might not have been so true if 9/11 never happened.

Automated Telephone Systems (3, Informative)

Anonymous Coward | more than 5 years ago | (#26339437)

Kurzweil may not be as far off on human-computer speech interfaces as you might first think. The technology is currently focused on a narrow domain: automated telephone systems, which are pretty much all voice-activated these days.

Re:Automated Telephone Systems (4, Interesting)

Khakionion (544166) | more than 5 years ago | (#26339529)

And as for those "image transformers," they're around too, but not so widespread, and they're at the will of the video sender, not the receiver.

http://blog.makezine.com/archive/2006/04/logitech_quickcam_orbit_mp_1.html

Re:Civil Liberties (3, Insightful)

Anonymous Coward | more than 5 years ago | (#26339787)

That is utter bullshit. Civil liberties have been going downhill for a long long time. His prediction on civil liberties was already true before 9/11.

Re:Civil Liberties (2, Funny)

RuBLed (995686) | more than 5 years ago | (#26339927)

His prediction on civil liberties might not have been so true if 9/11 never happened.

Kurzweil had 4chan predicted to counter that.

What a year (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26339393)

Israeli forces are about to pound the fuck out of the Islamic savages. Meanwhile, John Travolta's son dies a tragic scientology-related death.

2009 is here, and its name is karmic retribution! We shouldn't take out too many of those Islamic zealots because if we did, who would be left to suicide-bomb the hollywood church of scientology? Hmm, travolta...dead son...flies jets...*grin*...hmm, never mind. The situation will remedy itself!

My penis is hard! (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26339413)

Attention all hot women: please form a line.

Re:My penis is hard! (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26339497)

I'm really hot and hard...er, wet, yeah, that's it. Where does the line start?

Re:My penis is hard! (0)

Anonymous Coward | more than 5 years ago | (#26339769)

Spoken like the truly delusional zero who thinks he's a ten.

So, basically (5, Insightful)

Junior J. Junior III (192702) | more than 5 years ago | (#26339421)

Kurzweil has a really good handle on where hardware will be, but not software. What I believe this means is that what drives the creation of software is not how quickly it can be developed, but whether there's demand for it.

Demand and innovation are a lot trickier to predict than advances in the speed and miniaturization of electronics hardware, so what we envisioned our future selves wanting in 2009 isn't quite what it turns out we actually wanted.

Kurzweil thinks speech interface is where it's at, but the world gives us Twitter and Facebook.

Kurzweil wants to use technology to make us immortal or give rise to machines that supercede humankind and take the next evolutionary step as a technological rather than biological one. Meanwhile, people want to make money, get laid, watch stupid video clips, listen to music, and act like their opinion is the best thing there's ever been.

So... Where'll we be in the future? Watch Idiocracy.

Re:So, basically (4, Interesting)

Spaseboy (185521) | more than 5 years ago | (#26339639)

People have never really wanted a speech interface; it's been around FOREVER and has not taken off, even when it's quite good.

Re:So, basically (5, Interesting)

Junior J. Junior III (192702) | more than 5 years ago | (#26339715)

Right. Kurzweil thinks they're awesome, in part, I believe, because he sees them as an incremental stepping stone to developing machines that think. In real life, users get tired after talking for a long time. Imagine how hoarse you'd be if you had to talk to a computer all day long in order to dictate a Word document, launch apps, navigate the interface, etc.

Pointers and keyboards are far more efficient for such tasks. Are there tasks for which a voice interface would be better suited? Perhaps, but I don't think we've seen the applications developed yet that work better with voice than by manual input. Maybe voice-dialing for your cell phone? Nothing else springs to mind.

Would having a conversation with a computer that was capable of understanding conversational English be awesome? I imagine it would be. But what would we talk about? What would I do with such a computer that I couldn't do with my current PC?

Probably a few things would be a lot easier: programming by telling the computer what to do in natural language rather than having to write objects and procedures in a high-level computer language, or perhaps gaming applications.

Yeah, that'd be awesome. but that's nowhere near being on the horizon yet, and I don't know that we'll ever get there, because where's the demand for the intermediary steps that would lead us there, and what would those intermediary steps even be??

Hi there sexy... (1)

msimm (580077) | more than 5 years ago | (#26340137)

What would you do? You'd be reprogramming your younger brother's computer to use the voice of Leslie Nielsen for his Talk Sex chatware.

Re:So, basically (4, Insightful)

wrook (134116) | more than 5 years ago | (#26340289)

Probably a few things would be a lot easier: programming by telling the computer what to do in natural language rather than having to write objects and procedures in a high-level computer language...

Actually, I don't think programming would be any easier at all. We already have people telling programmers what they want in human language (PGMs, i.e., program managers) and the result is universally horrible. In reality, the hard part about programming is sorting out the nitty-gritty details. Transcribing the solution to the computer is not difficult. And I would *not* want to try to discuss these solutions in such detail in natural language.

This is precisely why design documentation tends to go out of date very quickly -- it's written in the wrong language. We can't easily specify the level of detail we require in natural languages and so defer it to programming.

Re:So, basically (0)

Anonymous Coward | more than 5 years ago | (#26340677)

Songs pack a lot more information than their words alone, and our understanding of them may be biological. Based on natural limitations, sight and sound are the fastest ways to communicate. Based on the natural environment, sound is the easier of the two to disambiguate when there is sensory interference.

I think communicating with a computer naturally doesn't tend towards using one's voice because there's not much chance for interference anyway. The computer is logical and requires simplicity; the user is complex and requires simplicity, too. Additionally, if *speed* is the concern, networked computers can talk to each other much faster than sound travels, and we can still instruct the computer to convert this into sound information if we really want to. VoIP is in, but talking to your computer? Not until the computer wants to be talked to, I think.

Re:So, basically (3, Interesting)

Gerzel (240421) | more than 5 years ago | (#26339795)

I think the problem is that developers focus on creating a pure speech interface rather than a mixed one.

Also complicating things is the fact that we already use speech to interface with the world around us; other people, telephones, and the like are often talked to while we're using the computer's keyboard and mouse. How is the computer to know what is a command and what is being spoken to someone else?

You either have to offset spoken commands with some token that won't come up in conversation and normal background speech, or you have to give the computer context recognition, which is also difficult.

I'd like to see a revival of Latin: make all speech-control software respond to Latin phrases while normal speech is carried out in everyday language. Latin would be ideal because it is dead and has a focus on commands in its grammatical makeup.
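A minimal sketch of that token-gating idea, assuming a generic speech-to-text front end that hands us transcribed strings; the wake token and command table here are invented for illustration:

```python
# Minimal sketch of token-gated voice commands: ignore any utterance
# that doesn't start with a wake token unlikely to occur in normal
# speech. The wake token and command table are hypothetical.

WAKE_TOKEN = "computer"  # or a Latin imperative, per the post above

COMMANDS = {
    "open browser": lambda: print("launching browser..."),
    "lock screen": lambda: print("locking screen..."),
}

def handle_utterance(transcript: str) -> None:
    """Dispatch a transcribed utterance, ignoring background speech."""
    words = transcript.lower().split()
    if not words or words[0] != WAKE_TOKEN:
        return  # background conversation: not addressed to us
    command = " ".join(words[1:])
    action = COMMANDS.get(command)
    if action:
        action()
    else:
        print(f"unrecognized command: {command!r}")

# Only the first utterance is treated as a command.
handle_utterance("computer open browser")
handle_utterance("so anyway, I told him to open the browser himself")
```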

Re:So, basically (5, Funny)

multisync (218450) | more than 5 years ago | (#26339867)

How is the computer to know what is a command and what is being spoken to someone else?

"Computer.

Earl Grey.

Hot."

Re:So, basically (1)

johanatan (1159309) | more than 5 years ago | (#26339893)

But then, how would doctors use it?

Re:So, basically (1)

VagaStorm (691999) | more than 5 years ago | (#26340263)

I believe that in the local hospital here, instead of the doctor making memos that a secretary types up like in the old days, speech-to-text is used as a very efficient way of entering patient journals.

Re:So, basically (1)

johanatan (1159309) | more than 5 years ago | (#26340531)

But, my point was that if we use Latin to send commands to computers, then doctors will inadvertently send commands to theirs because they are already speaking Latin in the professional setting. Granted, the types of words they currently use (anatomy, etc) would probably not overlap with the imperatives.

Re:So, basically (2, Insightful)

anothy (83176) | more than 5 years ago | (#26340107)

the speech thing is really interesting. kurzweil, like the people producing the software, focuses on dictation-style systems. this is bad for two reasons. first, it's a much harder problem, technically: you need both better signal processing and confidence in your recognition, and you need vastly smarter software to resolve ambiguities based on context (and likely other factors). second, it's a less well-defined use case, which dilutes market demand. saying "use your computer like you do today, but talk to it!" just isn't compelling for most people. the new capabilities should offer new modes of interaction, or new functionality, not just another way to do the same old thing.
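To make the ambiguity problem concrete, here is a toy sketch of the context scoring a dictation system has to do for nearly every word; the homophone sets and bigram counts are invented for illustration:

```python
# Toy homophone disambiguation: pick the candidate spelling that the
# preceding word makes most likely. Real recognizers use large language
# models; these bigram counts are made up for illustration.

HOMOPHONES = {"there": ["there", "their", "they're"],
              "to": ["to", "too", "two"]}

BIGRAM_COUNTS = {  # counts of (previous word, candidate) pairs
    ("over", "there"): 50, ("over", "their"): 2, ("over", "they're"): 1,
    ("lost", "their"): 40, ("lost", "there"): 3, ("lost", "they're"): 1,
}

def disambiguate(prev: str, heard: str) -> str:
    """Choose among homophones using the previous word as context."""
    candidates = HOMOPHONES.get(heard, [heard])
    return max(candidates, key=lambda c: BIGRAM_COUNTS.get((prev, c), 0))

print(disambiguate("over", "there"))  # -> "there"
print(disambiguate("lost", "there"))  # -> "their"
```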

Re:Idiocracy (5, Insightful)

Anonymous Coward | more than 5 years ago | (#26339659)

Excellent post. The worst thing too is that techy Internet pundits always bring up the Idiocracy reference, as if only the Internet could walk in a clean white suit above the supposed muck of the idiot masses.

But of course, they all forget their own idiocratic backyard that includes places like 4chan, /b/, and Encyclopedia Dramatica. Or even places like Boing Boing or YouTube, which is a constant barrage of bite-sized irrelevant data for the ADHD crowd.

/.'ers don't need to watch Idiocracy. We are living in an Internet Idiocracy that no one cares to improve because of the lulz. Neil Postman's 'Amusing Ourselves to Death' is THE ultimate predictor of the future. We are going to giggle ourselves to death with LOLcats, and people will argue vehemently that it's morally better than any alternative. Like Postman said, we'll beg to stay entertained.

Mod up (0)

DrEasy (559739) | more than 5 years ago | (#26339955)

Of course my mod points had to expire yesterday...

I checked out Postman on Wikipedia (hey I'm no better than the rest of society, I'm easily satisfied with digests!) after reading your post, and I have to agree 100%. Mind you, he wasn't the first to predict the "Society of the Spectacle". Guy Debord did that in the 60s, albeit in a much more cryptic way.

But it indeed seems that we have gone from the opium of the masses (TV nicely putting us in a passive coma state) to the crack cocaine of the masses (i.e. the internet), where we need non-stop shots of our LOLcat fix or a one-liner IM from a friend during class.

Re:Mod up (5, Insightful)

iocat (572367) | more than 5 years ago | (#26340409)

Have you ever read any real history of people's day-to-day lives in Western societies 50, 100, 200, or hell, even 25 years ago? They kind of sucked. There was no real time to ponder the meaning of life because it was a constant struggle to get enough to eat. Most history focuses on the societal elites, because they were the only ones doing anything interesting, while everyone else worked too hard to do much other than sleep, eat (hopefully), and work some more, so it's sometimes hard to get perspective on this issue.

Contrast that to today. In Western societies (at least) just about everyone from the bottom of the barrel to the top has plenty of free time -- when they're not scarfing down cheap caloric loads that would stagger their forebears -- to surf 4chan and Something Awful, and play videogames. Yep, when freed from want, it turns out most people go for entertainment.

To which I can only say two things. First -- what the fuck do you expect? We're APES. It's not like there's some special nobility gene waiting to be turned on the second we have computers. And second -- who cares? People who want to do interesting things can still do interesting things (see: universities, Make magazine, the people who provide content for the unwashed masses on YouTube, the open source movement, etc.).

Re:Idiocracy (3, Insightful)

lordvalrole (886029) | more than 5 years ago | (#26340163)

Gracchus: Fear and wonder, a powerful combination.
Gaius: You really think people are going to be seduced by that?
Gracchus: I think he knows what Rome is. Rome is the mob. Conjure magic for them and they'll be distracted. Take away their freedom and still they'll roar. The beating heart of Rome is not the marble of the senate, it's the sand of the coliseum. He'll bring them death - and they will love him for it.
-gladiator

Re:So, basically (1, Insightful)

zappepcs (820751) | more than 5 years ago | (#26339681)

There is a corollary to what you are saying. Demand drives innovation in the consumer market; this much is certain. Can you say Betamax vs. VHS? That was not a positive innovative step, even though it was driven by demand, so it works both ways.

What twitter and facebook are... well, technically speaking, they are tweaks to current hardware and understanding of the Internet as a system. One of the key driving factors is that they are reasonably simple to use and users are allowed to 'pimp out' their little space fairly easily. This is something that I've noticed in consumer driven changes. If you can't pimp it out, then people want to have the item that is envied as is. Can you say 'Apple halo effect'?

Computers and software often offer neither of these, or they are not accomplished easily. I've been trying to understand how to apply this to Linux. Any good small business has to have a plan. That plan should include something that sets it apart from every kid in his mom's basement.

I don't think Idiocracy is where we're headed, but rather wherever the next personal tech that can be pimped out or personalized takes us. I predict (not necessarily in 2009) that computing will make the grade again when a user interface can be pimped out with voice and 3D animations so that the actual experience is nothing like we get with /. or current technology. Some of this can be seen already, but it requires a bit more than average hardware to get the oohs and aahs. When average hardware catches up and the end-user experience starts closing in on what we viewed on Star Trek (RIP Majel), it will see a resurgence in popularity and development.

I can envision a 3D world not unlike Second Life that is the end-user interface. Documents are in a virtual file cabinet, the little TV is where you launch videos, and perhaps your avatar has a tricorder for surfing the web. Who knows exactly, but this virtual-world end-user experience will make a large difference. Instant messaging will be more like going to visit a friend's house, or meeting them in virtual Paris. MySpace will be a small chunk of the 3D world, pimped out for visitors.

When surfing the web becomes as interesting as the end user wants to make it, we'll see changes. You and I might prefer some stark, spartan setup with FF for browsing, with tabs and multiple windows. A 16-year-old girl might like her Internet full of ponies and glass slippers. Guys might like to decorate the trashcan of their OS with the logo of a football team they despise. There are myriad, as yet unfathomable, ways to pimp out the end-user experience while keeping users inside a sandbox and away from the important stuff that they neither want to fuck with nor know about.

One youngster here in this house would be fine with a user interface or desktop that looks like a hockey rink, where he'd move about in the rink to access 3D objects that open what you and I call normal applications. He'd pay the NHL 50 bucks for the 'skins' to pimp it out, too. Nothing like hearing your team's anthem instead of a drumroll when you log in. Yeah, sure, that can be done now, but it's much more consumer-oriented to sell a CD with the install icon and have it all set up for you except for a few tweaks like picking the tune.

Well, enough of that. Computers are not made for the throw-away generations. Not yet. When they are, we'll see much more innovation and hi-tech application to low-tech processes. Imagine that little girl who wants her computer to be all ponies. Her alarm clock is a soft toy pony. She can talk to the pony alarm clock, and because it is connected to her computer, the pony can tell her she has a message from Grandma. "Read it to me," says the girl. The pony does. "Tell Grandma I love her," says the girl. The pony replies to the email. At some point in the future, the near future, none of us will think this is awesome or odd or amazing. It will just be how things are... or can be.
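Mechanically, the pony scenario is just a voice-assistant pipeline: speech in, intent match, action, speech out. A toy sketch with the speech stages stubbed out; every name and the inbox data here are hypothetical:

```python
# Toy voice-assistant loop for the "pony" scenario: transcribe speech,
# match an intent, act, and speak the result. The STT/TTS functions are
# stubs; a real device would call actual speech engines.

def speech_to_text(audio: bytes) -> str:
    return audio.decode()          # stub: pretend audio is already text

def text_to_speech(text: str) -> None:
    print(f"(pony says) {text}")   # stub: print instead of synthesizing

INBOX = [{"from": "Grandma", "body": "I miss you!"}]

def handle(audio: bytes) -> None:
    request = speech_to_text(audio).lower()
    if "read it to me" in request:
        text_to_speech(INBOX[-1]["body"])
    elif request.startswith("tell grandma"):
        reply = request.removeprefix("tell grandma").strip()
        text_to_speech(f"Replying to Grandma: {reply}")
    else:
        text_to_speech("I didn't understand that.")

handle(b"Read it to me")
handle(b"Tell Grandma I love her")
```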

Re:So, basically (1)

porl (932021) | more than 5 years ago | (#26339933)

so... you see the future as microsoft bob version 2?

hmm.... :D

Re:So, basically (1)

zappepcs (820751) | more than 5 years ago | (#26340057)

No, not even close. I see a future where the end user has to know little if anything about computers to run one, and they are fun, not just fancy typewriters that double as video displays.

I see your point but I'd rather think of it as something like KDE 14.7.3, or as they like to say in Marketing "VR Desktop" or some such nonsense.

The idea of using a 3D world/space to access applications brings the user into a realm where their natural manipulations and perceptions make sense, rather than having to 'learn about computers' to get anything done.

Re:So, basically (0)

Anonymous Coward | more than 5 years ago | (#26340355)

Her alarm clock is a soft toy pony. She can talk to the pony alarm clock, and because it is connected to her computer, the pony can tell her she has a message from Grandma. "Read it to me," says the girl. The pony does. "Tell Grandma I love her," says the girl. The pony replies to the email.

Am I the only person who sees something verging on VERY WRONG with this image?

Re:So, basically (0)

Anonymous Coward | more than 5 years ago | (#26340387)

I can envision a 3D world not unlike Second Life...

Hopefully you're not picturing what I am [kurzweilai.net].

Re:So, basically (2, Funny)

hitchhacker (122525) | more than 5 years ago | (#26339719)

So... Where'll we be in the future? Watch Idiocracy.

So like.. in the future... we'll be watching Idiocracy?

-metric

Re:So, basically (0)

Anonymous Coward | more than 5 years ago | (#26339783)

Nah, in the future chicken little prophets will be stating, "In the future people will be watching NatGeo documentaries and reading The Economist! This is our future! Where smart people take our jobs! It's a dystopia!"

Re:So, basically (2, Interesting)

buraianto (841292) | more than 5 years ago | (#26339987)

so what we envisioned our future selves wanting in 2009 isn't quite what it turns out we actually wanted.

Right. He knows that what we want is what we end up achieving. And I'm sure he knows that he will be wrong on some of his predictions. A large part of what he is doing when he makes these predictions is trying to get people informed about what is possible, to stimulate people's imaginations, so that we will want the things that he thinks are important and good for our future. The goal of making predictions is more than just to be Slashdot fodder ten years from now.

So, Slashdotters, what do you want in ten years?

I Predict... (5, Funny)

Klootzak (824076) | more than 5 years ago | (#26339433)

The following will happen in the next 10 years:

1. Some Terrorist group will blow something up.

2. That people will continue to argue whether Linux is superior to Windows (and vice versa) on an ideological basis and continue to ignore individual situations/circumstances where the opposing OS would make a better choice.

3. That people will still buy (or not buy) Macs based on a fashion-over-function idea (despite the fact that the actual Mac offering isn't too bad functionally).

4. That people will make a bunch of random predictions, and several of these will pan out as predicted, and people will say "Oh Wow!!!" (and then post the original predictions to Slashdot).

Re:I Predict... (4, Funny)

Koiu Lpoi (632570) | more than 5 years ago | (#26339627)

3. That people will still buy (or not buy) Macs based on a fashion-over-function idea (despite the fact that the actual Mac offering isn't too bad functionally).

Very true. A friend of mine "obtained" the latest beta of Windows 7, and was showing it to me. I pointed out that pinning items to the taskbar was just like what's been in OS X for a long time now, and he replied (quite seriously), "Yes, but this isn't pretentious."

Re:I Predict... (0, Flamebait)

Fluffeh (1273756) | more than 5 years ago | (#26339793)

and he replied (quite seriously), "Yes, but this isn't pretentious."

Yeah, chalk one up for the non-mac users. That's right, mr fashionable, mr savvy, mr polo neck sweater! Your fashion has caught up with you, and the rest of us rabble have formed an angry mob! Beware of us and run for the hills because we fear your beauty! First it's the cutting remarks like your friend so eloquently put - next... flaming brands and pitchforks!

Re:I Predict... (0)

Anonymous Coward | more than 5 years ago | (#26339637)

3. That people will still buy (or not buy) Macs based on a fashion-over-function idea (despite the fact that the actual Mac offering isn't too bad functionally).

So they won't buy Macs because of blind faith/fanboyism?

4. That people will make a bunch of random predictions, and several of these will pan out as predicted, and people will say "Oh Wow!!!" (and then post the original predictions to Slashdot).

My predictions aren't random. They're calculated based on Level III Multiverse theory. I predict every possible outcome for every possible event. In one of those universes, all my predictions are correct.

A few of my own predictions...

5. Richard Stallman will get hairier.

6. Google will become more evil.

7. Someone on slashdot will come up with a meme that covers Soviet Russia, car analogies & Profit lists.

Re:I Predict... (1)

AceofSpades19 (1107875) | more than 5 years ago | (#26339695)

3. That people will still buy (or not buy) Macs based on a fashion-over-function idea (despite the fact that the actual Mac offering isn't too bad functionally).

So they won't buy Macs because of blind faith/fanboyism?

4. That people will make a bunch of random predictions, and several of these will pan out as predicted, and people will say "Oh Wow!!!" (and then post the original predictions to Slashdot).

My predictions aren't random. They're calculated based on Level III Multiverse theory. I predict every possible outcome for every possible event. In one of those universes, all my predictions are correct.

A few of my own predictions...

5. Richard Stallman will get hairier.

6. Google will become more evil.

7. Someone on slashdot will come up with a meme that covers Soviet Russia, car analogies & Profit lists.

I really don't understand why everyone thinks Google is evil when they are just doing what every company would do.

Re:I Predict... (0)

Anonymous Coward | more than 5 years ago | (#26339991)

I really don't understand why everyone thinks Google is evil when they are just doing what every company would do.

I can't speak for "everyone" but my guess would be that they believe that corporations are inherently evil. Or maybe it's just large institutions. There are many concepts of evil; "insidious and deliberate" is only one of them. So, when you point out that Google isn't actually plotting to deliberately harm anything, you might be speaking to a definition that the other person isn't using. "Ultimately contributes to disharmony and discord regardless of intentions" is another concept of evil. Maybe that describes the data mining and concomitant loss of privacy that Google and other companies are involved in. "Has too many financial or other potential conflicts of interest to actually represent the kind of neighbors we want them to be" could be yet another. That one might describe the basic idea that corporations must make a profit to survive, so they have motives of their own that may diverge from those of their customers or the community (perfect example: DRM or any sort of vendorlock).

I'm not saying that any of the above actually describes Google. Just that people generally seem to be getting a wee bit tired of always feeling like corporations have more clout in business and politics than the people themselves have as customers and constituents. Maybe that old "what's good for $MULTINATIONAL_CORPORATION is good for America" type of sentiment just isn't good enough to justify the social costs anymore. There's plenty of distrust to go around. This probably isn't such a bad thing. If Google is currently a good, honest company then some suspicion and scrutiny might help to keep them that way.

Re:I Predict... (1)

AceofSpades19 (1107875) | more than 5 years ago | (#26340021)

You can't really get mad at corporations for doing what they have to do to survive. I can see getting mad at them when they deliberately do something that is illegal, e.g. when MS bribed ISO. If no corporations did anything "evil" then most of us would be out of a job, because all corps do something that is slightly evil sometimes. But I can see your point, though.

Re:I Predict... (0)

Anonymous Coward | more than 5 years ago | (#26340085)

All companies are evil; that's why it's flippin' ridiculous that Google says they aren't. Of course, if they didn't say they weren't evil, they wouldn't have the potential to be the *evilest* company ever. (They pretty much control the horizontal and the vertical on that shiny new thing called the Internet.)

Let me clarify a bit. Generally, companies are legally a sort of identity, which can be used to absorb liability. The company itself has no conscience, no internal dialog to question the broader consequences of its actions.

It's typically accepted that ~10% of the population is psychopathic (forgive me for not linking). This means that any given political or business construct has a 10% chance of coming under the control of a nutbag whenever power changes hands.
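Taking the parent's ~10% figure at face value (it is the poster's assumption, not an established statistic), the compounding across successive transfers of power is easy to work out:

```python
# If each transition of power independently carries a 10% chance of
# handing control to a psychopath, the chance that at least one of n
# successive leaders is one grows quickly. The 10% figure is the
# parent poster's assumption, not an established statistic.

p = 0.10
for n in (1, 5, 10, 20):
    at_least_one = 1 - (1 - p) ** n
    print(f"after {n:2d} transitions: {at_least_one:.0%}")
# after  1 transitions: 10%
# after  5 transitions: 41%
# after 10 transitions: 65%
# after 20 transitions: 88%
```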

Regardless of the current moral beliefs held at google, the aggregation of so much power could be construed as, at least, socially irresponsible.

Google is the information analogue of the nuclear bomb.

Re:I Predict... (2, Insightful)

MrMr (219533) | more than 5 years ago | (#26340291)

they are just doing what every company would do
Exactly, and being or owning a company is the safest way to be evil and not be punished.
Seriously, what reason is there to 'limit liability' for the owners if not to do things they wouldn't do personally?

Re:I Predict... (2, Interesting)

Kent Recal (714863) | more than 5 years ago | (#26340673)

Seriously, what reason is there to 'limit liability' for the owners if not to do things they wouldn't do personally?

The reason is that otherwise hardly anyone would ever start a company - other than people who already own truckloads of money.

Let's say you have this really great idea for a word processor. You can build it and bring it to market, no problem. But you know that MS will probably sue you, and you can't be sure about the outcome of that. With limited liability you at least know that the worst that can happen is that your company goes out of business and you've wasted a few years of your life.
Without limited liability on the other hand... Well, good luck paying off those seven digit "virtual damages" that some court may bill you for.

I Don't Think That Word Means What You Think... (3, Insightful)

Zontar The Mindless (9002) | more than 5 years ago | (#26339463)

...the emergency of digital objects...

Re:I Don't Think That Word Means What You Think... (3, Funny)

marciot (598356) | more than 5 years ago | (#26339665)

...the emergency of digital objects...

I blame it on my continuous speech recognition (CSR) software, which has ubiquitously replaced my keyboard. That, and a lack of artificial intelligence on my part.

Re:I Don't Think That Word Means What You Think... (5, Insightful)

mfnickster (182520) | more than 5 years ago | (#26340041)

Their is nothing Ron with speech wreck ignition. I use it inns Ted of my keyboard awl the thyme.

Will someone shut him up yet? (2, Insightful)

Anonymous Coward | more than 5 years ago | (#26339465)

What? He got like 3 right out of 40.

If you throw enough crap against a wall, some of it will stick.

Kurzweil's 60. At this point, he can't seriously believe that technology is going to keep him alive forever anymore, can he?

Re:Will someone shut him up yet? (4, Insightful)

nprz (1210658) | more than 5 years ago | (#26339717)

Did he even get 3?

This one is obvious:
"Individuals primarily use portable computers, which have become dramatically lighter and thinner than the notebook computers of ten years earlier. "

Am I supposed to think that they just get bigger and bigger after 10 years?

"Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication."

Wrong; we don't have an ever-present worldwide network. Even finding 'hot spots' is hard.

"Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology."

Mouse/keyboard is about it. Displays won't be wireless.

"Government agencies, however, continue to have the right to gain access to people's files, which has resulted in the popularity of unbreakable encryption technologies."

Umm, I guess they still have access if they have a warrant.
I don't see your average person using encryption, let alone the 'unbreakable' type.

The only thing he got right is the obvious one. The rest are off. Making a 10-year prediction isn't very fun anyway. 20-year or longer predictions are great, especially if they include flying personal transportation.

Re:Will someone shut him up yet? (4, Insightful)

wurp (51446) | more than 5 years ago | (#26339761)

Er, my phone certainly does have an essentially ever-present connection to the worldwide network. And my phone is a Linux machine I would have been proud to have on my desktop 7 years ago.

Re:Will someone shut him up yet? (4, Interesting)

muridae (966931) | more than 5 years ago | (#26339947)

You are right. Just because it doesn't look like what we thought it would look like ten years ago doesn't mean it isn't happening. To the GP, that unbreakable encryption is available if you want it. Since the government does have access without a warrant (or have you ignored the past few years' discussion of warrantless wiretaps?), strong encryption has become quite common. And you might use it without knowing it, like SSL for banking.
Devices are all capable of talking to each other, via Bluetooth or other means. Contactless smart cards fit as the ID protection on a chip; so do RFID passports, even if they aren't as secure as he had hoped. Memory on portable devices has moved away from rotating platters. The Kindle and other e-books are out there, and while I still prefer the contrast of paper and the lack of DRM, they are popular. Telephones do send high-res pictures and video; my 'new' cellphone is capable of both (it's only new to me, the model has been out for some time). And his prediction of online dating/virtual sex nicely sums up all the problems of Second Life. As for people preferring to interact with female AI, he's right. Wasn't there an article here about more people choosing the female workout instructor in Wii Fit?
For his predictions of art, I've seen a lot of the things he dreamed up. People are making music with Guitar Hero 'toys', and cooking up strange new instruments with accelerometers. He didn't get it all right, but he was close.
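For anyone who doubts that "unbreakable for practical purposes" encryption is within easy reach, here is a minimal sketch using the third-party cryptography package's AES-GCM primitive; key storage and exchange are deliberately omitted, so treat it as an illustration, not a security recipe:

```python
# Minimal sketch: authenticated encryption with AES-256-GCM via the
# third-party "cryptography" package (pip install cryptography).
# Key management is omitted; this only shows the primitive itself.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit random key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce; never reuse per key

ciphertext = aesgcm.encrypt(nonce, b"my private files", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"my private files"
```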

Re:Will someone shut him up yet? (4, Interesting)

Zontar The Mindless (9002) | more than 5 years ago | (#26340125)

Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication.

Wrong; we don't have an ever-present worldwide network. Even finding 'hot spots' is hard.

I beg to differ. Two weeks ago today, I stood on a beach in Australia — at Hat Head, which, for the curious, is a small and fairly unremarkable seaside town in New South Wales, about 500 km from the nearest large city — where I had no trouble using my Swedish mobile phone/SIM to

  • send a photo I'd just taken of a pelican using my mobile to my girlfriend (who, at the time, was in a small town in Spain that happens to be about as close to the middle of nowhere as you can get and still be on the Iberian Peninsula)
  • upload a photo of my daughter holding a hermit crab she'd just caught to my website, which (last I heard) is hosted in Texas, so my parents (in Florida and North Carolina) could see it (this required popping the memory card out of the camera and into the phone, too bad the camera doesn't support Bluetooth)
  • respond to a text message from a friend of mine who runs a café in Stockholm
  • look up the Swedish words for "pelican" and "hermit crab" in an online dictionary
  • ring a friend of mine in Thailand to let her know I'd had to change my plans and would be returning to Europe via Singapore rather than Bangkok
  • fire off scripts on my two laptops — one at my ex's place in Kempsey (35 km inland) and the other back in my flat in Stockholm, both using WiFi connections — to update my MySQL server repos and do new builds
  • update my status on Facebook

Now... You were saying something about the lack of world-wide wireless connectivity...? :)

Re:Will someone shut him up yet? (5, Funny)

Bottlemaster (449635) | more than 5 years ago | (#26340303)

send a photo I'd just taken of a pelican using my mobile to my girlfriend (who, at the time, was in a small town in Spain that happens to be about as close to the middle of nowhere as you can get and still be on the Iberian Peninsula)

This is by far the most remarkable. 10 years ago, I don't think anyone guessed that pelicans would be using mobile phones in 2009.

Re:Will someone shut him up yet? (1)

Zontar The Mindless (9002) | more than 5 years ago | (#26340753)

10 years ago, I would never have guessed that I'd receive a Troll mod for a misplaced modifier.

Note to sorry excuse for moderator: "I don't agree" != "Troll".

Re:Will someone shut him up yet? (1)

Repossessed (1117929) | more than 5 years ago | (#26340131)

"Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology."

All of these except displays are commonplace. And despite the pointlessness, people continue to buy wireless network printers and sit them 3 feet from the router. Technically even displays are possible; you can get short-range retransmission devices for televisions (meant to hook cable up to an entire household with one output).

All in all, he's very good at predicting technology, and very bad at predicting what people want.

Re:Will someone shut him up yet? (1)

ZombieWomble (893157) | more than 5 years ago | (#26340497)

In reality, he didn't even get the "Things will get smaller" prediction that right. He was qualitatively right, in that "smaller" is the way things go, but he missed by probably an order of magnitude or two - suggesting that people would be wearing dozens of PCs across their body embedded in clothing and jewellery is a rather different view of "personal computing" than someone carrying around a netbook.

Re:Will someone shut him up yet? (0)

Anonymous Coward | more than 5 years ago | (#26340029)

Well, in defense of Kurzweil...

It is now 2009. Individuals primarily use portable computers, which have become dramatically lighter and thinner than the notebook computers of ten years earlier. Personal computers are available in a wide range of sizes and shapes, and are commonly embedded in clothing and jewelry such as wristwatches, rings, earrings, and other body ornaments. Computers with a high-resolution visual interface range from rings and pins and credit cards up to the size of a thin book.

Not bad. He was wrong about computers being embedded in jewelry, but the rest of the prediction is pretty spot on. The iPhone and Kindle and other similar devices are pretty damn close to what he was talking about IMHO, and cell phones and other non-PCs ARE more commonly used than traditional PCs these days by a lot of people.

People typically have at least a dozen computers on and around their bodies, which are networked using "body LANs" (local area networks).1 These computers provide communication facilities similar to cellular phones, pagers, and web surfers, monitor body functions, provide automated identity (to conduct financial transactions and allow entry into secure areas), provide directions for navigation, and a variety of other services.

This sounds way off at first glance, but right now I personally carry around 6 "computers" on me on a daily basis (USB keychain, two different RSA security fobs, iPhone, Bluetooth headset), and two of them communicate over Bluetooth, which essentially is a body LAN. I give Kurzweil half credit on this one, and the vision will probably be mostly accurate by 2015.

For the most part, these truly personal computers have no moving parts. Memory is completely electronic, and most portable computers do not have keyboards

Well, it depends on how you define a "keyboard". The iPhone has a "keyboard", but nothing like a traditional one. He is right about the moving parts, with the exception of laptops, which are still using traditional hard drives and optical media, except for the high-end ones using flash memory.

Rotating memories (that is, computer memories that use a rotating platter, such as hard drives, CD-ROMs, and DVDs) are on their way out, although rotating magnetic memories are still used in "server" computers where large amounts of information are stored. Most users have servers in their homes and offices where they keep large stores of digital "objects," including their software, databases, documents, music, movies, and virtual-reality environments (although these are still at an early stage). There are services to keep one's digital objects in central repositories, but most people prefer to keep their private information under their own physical control.

BAM. I'd say he totally nailed this one, as close to correct as this kind of prediction could be.

Cables are disappearing.2 Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology.

Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication. Digital objects such as books, music albums, movies, and software are rapidly distributed as data files through the wireless network, and typically do not have a physical object associated with them.

This one is mostly correct. Keep in mind that his prediction was for 2009, which gives another whole 12 months. This prediction will look a lot more spot on by next December than it does now IMHO - although even now I'd argue it's pretty damn spot on.
Also note that he said "disappearing". So he wasn't saying cables would be totally gone by now, just that they'd be on their way out.

The majority of text is created using continuous speech recognition (CSR) dictation software, but keyboards are still used. CSR is very accurate, far more so than the human transcriptionists who were used up until a few years ago.

Also ubiquitous are language user interfaces (LUIs), which combine CSR and natural language understanding. For routine matters, such as simple business transactions and information inquiries, LUIs are quite responsive and precise. They tend to be narrowly focused, however, on specific types of tasks. LUIs are frequently combined with animated personalities. Interacting with an animated personality to conduct a purchase or make a reservation is like talking to a person using videoconferencing, except that the person is simulated.

Ok, he was very overly optimistic here. We do have voice recognition systems that work decently in very narrow areas, and Google's new voice search app for the iPhone works pretty damn well, but we're nowhere near the point that Kurzweil predicted. His prediction looks better for 2012 than it does for 2009 (Google is getting damn close to solving this problem IMHO).
Also, his prediction that we'd have useful software agents by now was optimistic. Again, I'd say it's possible by 2012, but 2009 was optimistic.
In the long run, if we have this tech by 2012, then Kurzweil being off by only 3 or so years wouldn't look that bad.

Computer displays have all the display qualities of paper--high resolution, high contrast, large viewing angle, and no flicker. Books, magazines, and newspapers are now routinely read on displays that are the size of, well, small books.

Well, e-readers HAVE now entered the mainstream (and will be more mainstream as this year progresses), so I'll give him credit for this one.

Computer displays built into eyeglasses are also used. These specialized glasses allow users to see the normal visual environment, while creating a virtual image that appears to hover in front of the viewer. The virtual images are created by a tiny laser built into the glasses that projects the images directly onto the user's retinas.3

I think this one will look better by December than it does now. There are several different projects out there with working Augmented Reality systems at the early prototype level of development. And "display glasses" are right on the cusp of breaking into the mainstream IMHO. I would hold off on judging this prediction until the end of 2009.

Computers routinely include moving picture image cameras and are able to reliably identify their owners from their faces.

Well, face recognition IS becoming pretty common, but I wouldn't say it's exactly routine for computers to identify their owners this way. Again, I'd bet this one is right by 2012 or so.

In terms of circuitry, three-dimensional chips are commonly used, and there is a transition taking place from the older, single-layer chips.

Well, this hasn't been necessary yet because chip manufacturers are still squeezing more transistors into the 2D chips.

Sound producing speakers are being replaced with very small chip-based devices that can place high resolution sound anywhere in three-dimensional space. This technology is based on creating audible frequency sounds from the spectrum created by the interaction of very high frequency tones. As a result, very small speakers can create very robust three-dimensional sound.

Umm, no. This technology is around, but it hasn't developed beyond the trade-show gimmick yet, at least not that I have heard of. He fails on this prediction.
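For reference, the prediction describes parametric speakers: two inaudible ultrasonic tones mix in a nonlinear medium (the air), and their difference frequency is what you hear. A quick numpy sketch of that principle, with squaring standing in for the medium's nonlinearity:

```python
# Demonstrate the heterodyne principle behind "audio from ultrasound":
# two ultrasonic tones passed through a nonlinearity (modeled here as
# squaring) produce an audible difference tone.
import numpy as np

fs = 192_000                       # sample rate high enough for ultrasound
t = np.arange(fs) / fs             # one second of samples
f1, f2 = 40_000, 41_000            # two inaudible ultrasonic tones (Hz)
signal = np.sin(2*np.pi*f1*t) + np.sin(2*np.pi*f2*t)

demodulated = signal**2            # nonlinear medium mixes the tones
spectrum = np.abs(np.fft.rfft(demodulated))
freqs = np.fft.rfftfreq(len(demodulated), 1/fs)

audible = (freqs > 20) & (freqs < 20_000)  # audible band, excluding DC
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"strongest audible component: {peak:.0f} Hz")  # ~1000 Hz = f2 - f1
```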

A $1,000 personal computer (in 1999 dollars) can perform about a trillion calculations per second.4 Supercomputers match at least the hardware capacity of the human brain--20 million billion calculations per second.5 Unused computes on the Internet are being harvested, creating virtual parallel supercomputers with human brain hardware capacity.

The latest GPUs are just now passing the teraflop level, and they retail for less than $1000. CPUs are passing into the 30-50 gigaflop level and retail for well under $1000 (in fact, you could probably buy enough CPUs to reach a teraflop for under $1000 in 1999 dollars).
The largest supercomputers are now well past the petaflop level. So I'd say Kurzweil was correct on this one, and he will be even more correct by the end of this year when the next iteration of GPUs and CPUs hits the market.
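As a rough sanity check on those numbers, here's the arithmetic with ballpark 2009-era prices and ratings; every figure below is an assumption for illustration, not a measurement:

```python
# Rough check of "a trillion calculations per second for $1,000 in
# 1999 dollars". All figures are ballpark 2009-era assumptions.

CPI_FACTOR_1999_TO_2009 = 1.3          # ~30% cumulative US inflation
budget_2009_dollars = 1000 * CPI_FACTOR_1999_TO_2009

gpu_price, gpu_flops = 500, 1.0e12     # teraflop-class GPU, ~$500
cpu_price, cpu_flops = 200, 40e9       # ~40 GFLOPS desktop CPU, ~$200

print(f"budget in 2009 dollars: ${budget_2009_dollars:.0f}")
print(f"GPU route: {budget_2009_dollars / gpu_price * gpu_flops:.2e} FLOPS")
cpus = budget_2009_dollars // cpu_price
print(f"CPU route: {cpus:.0f} CPUs -> {cpus * cpu_flops:.2e} FLOPS")
# Under these assumptions the GPU route clears a teraflop easily,
# while the CPU route falls short.
```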

There is increasing interest in massively parallel neural nets, genetic algorithms, and other forms of "chaotic" or complexity theory computing, although most computer computations are still done using conventional sequential processing, albeit with some limited parallel processing.

BAM, dead on again.

Research has been initiated on reverse engineering the human brain through both destructive scans of the brains of recently deceased persons as well as noninvasive scans using high resolution magnetic resonance imaging (MRI) of living persons.

Again, bam, spot on. There are now multiple projects ongoing to simulate various parts of the mammalian brain, and the human brain for that matter. This has already led to advances in visual and audio signal processing.

Autonomous nanoengineered machines (that is, machines constructed atom by atom and molecule by molecule) have been demonstrated and include their own computational controls. However, nanoengineering is not yet considered a practical technology.

Again, absolutely dead on. Nailed it.

I'm only going to cover the "computer itself" portion of his predictions since this ran on way longer than I expected. I have to say his predictions were VERY impressive for a ten-year outlook. Most of them are either dead on or partially correct. And the ones he is off on will probably come to pass within the next 3-5 years.
That is very impressive compared to most of the ten-year predictions you'll see out there.

Re:Will someone shut him up yet? (1)

rhyder128k (1051042) | more than 5 years ago | (#26340569)

Spot on. The guy misses far more than he hits. Even when he hits, the innovation existed to some extent ten years ago.

another thing missed (5, Funny)

Anonymous Coward | more than 5 years ago | (#26339491)

And of course he missed the Spanish Inquisition. Possibly he didn't expect that.

Re:another thing missed (0)

Anonymous Coward | more than 5 years ago | (#26339741)

nobody EVER expects the Spanish Inquisition!

captcha Adultery!

Not bad (2, Interesting)

ceoyoyo (59147) | more than 5 years ago | (#26339525)

Quite a bit of that was eerie, when you consider it was written ten years ago. Most decade predictions are way off, with maybe one in ten or twenty hitting near the mark.

problematic economics... (1)

jfruhlinger (470035) | more than 5 years ago | (#26339565)

Despite occasional corrections, the ten years leading up to 2009 have seen continuous economic expansion and prosperity due to the dominance of the knowledge content of products and services. The greatest gains continue to be in the value of the stock market. Price deflation concerned economists in the early '00 years, but they quickly realized it was a good thing. The high-tech community pointed out that significant deflation had existed in the computer hardware and software industries for many years earlier without detriment.

The Dow Jones is currently a bit below where it was in '99; even before the recent crash (which may turn out to be one of these "occasional corrections", who knows) it was only up about 20 percent, which is a pretty poor 10-year investment. But heck, I'll forgive the usual techie stock market triumphalism. What I want to know is, does any sane person think that overall price deflation isn't terrible for the economy? It's crushing to anyone in any significant amount of debt (i.e. anyone who holds a mortgage).

Re:problematic economics... (1)

Kohath (38547) | more than 5 years ago | (#26339581)

What I want to know is, does any sane person think that overall price deflation isn't terrible for the economy?

It's not insanity. It's ignorance.

Re:problematic economics... (1)

Tubal-Cain (1289912) | more than 5 years ago | (#26339723)

What I want to know is, does any sane person think that overall price deflation isn't terrible for the economy?

I'm no economist, but if we have deflation right now I am pretty happy about it. Maybe pennies will once again be worth the metal we put in them.

It's crushing to anyone in any significant amount of debt (i.e. anyone who holds a mortgage).

Don't buy that house that costs so much more than your annual income.

Re:problematic economics... (3, Interesting)

jfruhlinger (470035) | more than 5 years ago | (#26339851)

The problem is that in deflationary periods incomes drop as well as prices, either by direct cuts in salaries or layoffs followed by new jobs that don't pay as well; otherwise everyone would be rich, by magic, which never happens. Thus my family's outstanding mortgage -- currently a fairly reasonable 120 percent or so of our annual income -- would become more and more of a burden.
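To put numbers on that point, here is the arithmetic; the income figure and the 3% annual deflation rate are illustrative assumptions, not data:

```python
# Why deflation hurts debtors: the mortgage principal is fixed in
# nominal dollars, but deflating incomes shrink under it.

income = 100_000          # hypothetical annual income
mortgage = 1.2 * income   # "about 120 percent of our annual income"
deflation = 0.03          # assume incomes fall 3% per year

for year in (0, 5, 10):
    shrunk = income * (1 - deflation) ** year
    print(f"year {year:2d}: mortgage is {mortgage / shrunk:.0%} of income")
# year  0: mortgage is 120% of income
# year  5: mortgage is 140% of income
# year 10: mortgage is 163% of income
```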

Re:problematic economics... (1)

wellingj (1030460) | more than 5 years ago | (#26339833)

Given the simple axiom of supply and demand, do you really think that the supply of dollars by the FED [stlouisfed.org] is outstripping the fall in demand caused by the economic downturn? Granted, that graph is a reflection of what banks are holding onto, but what do you think is going to happen to the purchasing power of the individual using USD once it all hits the consumer market? The better term for this is Stagflation [wikipedia.org], and this armchair economist still has a large amount of doubt about whether Keynesian economics can do anything about it.

As for my personal solution, I'll continue to write and design software for a hard product that is designed and manufactured locally in the US and sold globally. And that's the best I can do to help cure stagflation.

He got most of it completely wrong (4, Insightful)

freeweed (309734) | more than 5 years ago | (#26339583)

Most of his predictions that he got right were brain-dead obvious in 1999 - we already had portable computers coming into common use, and cellphones everywhere. This trend was pretty clearly going to continue. Hell, the Gameboy was proof enough that we were about to see a generation who grew up with portable computing. "Body LANs" don't exist in any meaningful form. People at best are wearing the utility belt of gadgets, some of which might talk Bluetooth to each other.

The rest? Wireless? Please. Bluetooth and Wi-Fi were just coming to fruition around that time, and obviously wireless use was going to come into play. Again, cellphones paved the way for this. Beyond that, though... I still see millions of wired speakers, mice, keyboards, DVD players, you name it. I still don't see wireless as the most common form of network access; hell, any network admin worth his salt will rant about the general poor performance of Wi-Fi. Wireless printers and displays never really came about (I do find it amusing that he says "occasional keyboard" - the most obvious use of a low-bandwidth wireless interface). His vision of ubiquitous wireless access never came about - the best we have is the cellphone networks, which, again, we already had 10 years ago.

Digital books, movies, music? Napster was already out by then. The entertainment industry did its best to stop this from happening and it's only been in the past year or three that it's even been practical (from a legal perspective).

Eyeglass displays have existed for a long, long time and never achieved much success.

A trillion calculations per second on a home computer, eh?

Anyway, just seems a bit underwhelming. He got so much completely wrong.

Re:He got most of it completely wrong (1)

TheKidWho (705796) | more than 5 years ago | (#26339697)

You can get a teraflop of performance from one of the newer Nvidia GPUs in a desktop PC!

Re:He got most of it completely wrong (5, Informative)

marciot (598356) | more than 5 years ago | (#26339713)

A trillion calculations per second on a home computer, eh?

According to Wikipedia [wikipedia.org], the ATI Radeon HD 4800 series achieves one teraflop. So I would say Kurzweil was right on the mark on that one.

Re:He got most of it completely wrong (0)

Anonymous Coward | more than 5 years ago | (#26339743)

I'd say a teraflop machine is pretty close to that if you look at some of the GPU technology available now. With CUDA and OpenCL it's becoming easier to actually take advantage of that raw floating-point power. Also, look at those NVidia Tesla cards: $1700 gets you right around a teraflop there. Sure, it's a little more than $1000, but it's not that far off.

Re:He got most of it completely wrong (2, Informative)

Chapter80 (926879) | more than 5 years ago | (#26339935)

The article says $1000 in 1999 dollars. So that'd be nearly $1500 today. I think he nailed this one.

Not sure what article the GP post read, but I thought it was pretty much spot on, and it was NOT all predictable. I challenge people to find a similar article that was anywhere near this close.

Then again, it doesn't surprise me. Kurzweil is very methodical in his predictions. He works the math.

He appears to have underestimated . . . (2, Insightful)

LuYu (519260) | more than 5 years ago | (#26339615)

. . . the lawyers.

This is surprising since the copyright fanatics spoke much more boldly 10 years ago than they do today.

How much of the truth of his predictions is the result of his predictions?

talking to my computer! No! (0)

Anonymous Coward | more than 5 years ago | (#26339623)

Maybe I've heard too many "you get three wishes" jokes... but I'd rather have an unambiguous, written syntax to control my computer, thank you very much.

Except for the most trivial of tasks, I think talking to your computer is still a long way off. When you type in a command, there is at least a double fail safe in that you have to type the right command, and you have a chance to review it before hitting enter.

English, at least (maybe this is its strong point), is very ambiguous at times. The literal and actual meaning of the spoken word is very slippery. I don't trust the idiot computers we have to do what I mean, given what I say, any time soon.

And, by soon, I mean...

Server... dying.... slashdotting... in... progress (0)

Anonymous Coward | more than 5 years ago | (#26339631)

The article seems to be being served by a win98 machine connected to the world by IP over wet string. And the site uses frames..... shudder. So here's the whole thing:

KurzweilAI.net

    Chapter Nine: 2009
by Raymond Kurzweil

        Ever since I could remember, I'd wished I'd been lucky enough to be alive at a great time--when something big was going on, like a crucifixion. And suddenly I realized I was.

        --Ben Shahn

        As we say in the computer business, "shift happens."

        --Tim Romero

It is said that people overestimate what can be accomplished in the short term, and underestimate the changes that will occur in the long term. With the pace of change continuing to accelerate, we can consider even the first decade in the twenty-first century to constitute a long-term view. With that in mind, let us consider the beginning of the next century.
The Computer Itself

It is now 2009. Individuals primarily use portable computers, which have become dramatically lighter and thinner than the notebook computers of ten years earlier. Personal computers are available in a wide range of sizes and shapes, and are commonly embedded in clothing and jewelry such as wristwatches, rings, earrings, and other body ornaments. Computers with a high-resolution visual interface range from rings and pins and credit cards up to the size of a thin book.

People typically have at least a dozen computers on and around their bodies, which are networked using "body LANs" (local area networks).1 These computers provide communication facilities similar to cellular phones, pagers, and web surfers, monitor body functions, provide automated identity (to conduct financial transactions and allow entry into secure areas), provide directions for navigation, and a variety of other services.

For the most part, these truly personal computers have no moving parts. Memory is completely electronic, and most portable computers do not have keyboards.

Rotating memories (that is, computer memories that use a rotating platter, such as hard drives, CD-ROMs, and DVDs) are on their way out, although rotating magnetic memories are still used in "server" computers where large amounts of information are stored. Most users have servers in their homes and offices where they keep large stores of digital "objects," including their software, databases, documents, music, movies, and virtual-reality environments (although these are still at an early stage). There are services to keep one's digital objects in central repositories, but most people prefer to keep their private information under their own physical control.

Cables are disappearing.2 Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology.

Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication. Digital objects such as books, music albums, movies, and software are rapidly distributed as data files through the wireless network, and typically do not have a physical object associated with them.

The majority of text is created using continuous speech recognition (CSR) dictation software, but keyboards are still used. CSR is very accurate, far more so than the human transcriptionists who were used up until a few years ago.

Also ubiquitous are language user interfaces (LUIs), which combine CSR and natural language understanding. For routine matters, such as simple business transactions and information inquiries, LUIs are quite responsive and precise. They tend to be narrowly focused, however, on specific types of tasks. LUIs are frequently combined with animated personalities. Interacting with an animated personality to conduct a purchase or make a reservation is like talking to a person using videoconferencing, except that the person is simulated.

Computer displays have all the display qualities of paper--high resolution, high contrast, large viewing angle, and no flicker. Books, magazines, and newspapers are now routinely read on displays that are the size of, well, small books.

Computer displays built into eyeglasses are also used. These specialized glasses allow users to see the normal visual environment, while creating a virtual image that appears to hover in front of the viewer. The virtual images are created by a tiny laser built into the glasses that projects the images directly onto the user's retinas.3

Computers routinely include moving picture image cameras and are able to reliably identify their owners from their faces.

In terms of circuitry, three-dimensional chips are commonly used, and there is a transition taking place from the older, single-layer chips.

Sound-producing speakers are being replaced with very small chip-based devices that can place high-resolution sound anywhere in three-dimensional space. This technology is based on creating audible-frequency sounds from the spectrum created by the interaction of very high frequency tones. As a result, very small speakers can create very robust three-dimensional sound.
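What he seems to be describing is the "parametric speaker" effect: a nonlinearity acting on two inaudible ultrasonic carriers produces their audible difference frequency. A toy numerical check, with illustrative frequencies and a simple square-law standing in for the real device's nonlinear medium:

<ecode>
# Toy demo: a nonlinearity acting on two inaudible ultrasonic carriers
# yields their audible difference tone. Frequencies are illustrative.
import numpy as np

fs = 2_000_000                         # 2 MHz sample rate
t = np.arange(0, 0.01, 1 / fs)         # 10 ms of signal
f1, f2 = 200_000, 201_000              # ultrasonic carriers, 1 kHz apart

signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
demodulated = signal ** 2              # square-law stand-in for the nonlinearity

spectrum = np.abs(np.fft.rfft(demodulated))
freqs = np.fft.rfftfreq(len(demodulated), 1 / fs)
band = (freqs > 20) & (freqs < 20_000)          # the audible range
peak = freqs[band][np.argmax(spectrum[band])]
print(f"strongest audible component: {peak:.0f} Hz")   # ~1000 Hz = f2 - f1
</ecode>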

A $1,000 personal computer (in 1999 dollars) can perform about a trillion calculations per second.4 Supercomputers match at least the hardware capacity of the human brain--20 million billion calculations per second.5 Unused computes on the Internet are being harvested, creating virtual parallel supercomputers with human brain hardware capacity.
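Taking the chapter's own two figures at face value, the harvesting claim implies a virtual supercomputer of about twenty thousand such PCs per brain-equivalent:

<ecode>
# Back-of-envelope using only the figures in the paragraph above.
pc = 1e12          # $1,000 PC: one trillion calculations per second
brain = 20e15      # "20 million billion calculations per second"
print(f"PCs per brain-equivalent: {brain / pc:,.0f}")   # 20,000
</ecode>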

There is increasing interest in massively parallel neural nets, genetic algorithms, and other forms of "chaotic" or complexity theory computing, although most computer computations are still done using conventional sequential processing, albeit with some limited parallel processing.

Research has been initiated on reverse engineering the human brain through both destructive scans of the brains of recently deceased persons as well as noninvasive scans using high resolution magnetic resonance imaging (MRI) of living persons.

Autonomous nanoengineered machines (that is, machines constructed atom by atom and molecule by molecule) have been demonstrated and include their own computational controls. However, nanoengineering is not yet considered a practical technology.
Education

In the twentieth century, computers in schools were mostly on the trailing edge, with most effective learning from computers taking place in the home. Now in 2009, while schools are still not on the cutting edge, the profound importance of the computer as a knowledge tool is widely recognized. Computers play a central role in all facets of education, as they do in other spheres of life.

The majority of reading is done on displays, although the "installed base" of paper documents is still formidable. The generation of paper documents is dwindling, however, as the books and other papers of largely twentieth-century vintage are being rapidly scanned and stored. Documents circa 2009 routinely include embedded moving images and sounds.

Students of all ages typically have a computer of their own, which is a thin tabletlike device weighing under a pound with a very high resolution display suitable for reading. Students interact with their computers primarily by voice and by pointing with a device that looks like a pencil. Keyboards still exist, but most textual language is created by speaking. Learning materials are accessed through wireless communication.

Intelligent courseware has emerged as a common means of learning. Recent controversial studies have shown that students can learn basic skills such as reading and math just as readily with interactive learning software as with human teachers, particularly when the ratio of students to human teachers is more than one to one. Although the studies have come under attack, most students and their parents have accepted this notion for years. The traditional mode of a human teacher instructing a group of children is still prevalent, but schools are increasingly relying on software approaches, leaving human teachers to attend primarily to issues of motivation, psychological well-being, and socialization. Many children learn to read on their own using their personal computers before entering grade school.

Preschool and elementary school children routinely read at their intellectual level using print-to-speech reading software until their reading skill level catches up. These print-to-speech reading systems display the full image of documents, and can read the print aloud while highlighting what is being read. Synthetic voices sound fully human. Although some educators expressed concern in the early '00 years that students would rely unduly on reading software, such systems have been readily accepted by children and their parents. Studies have shown that students improve their reading skills by being exposed to synchronized visual and auditory presentations of text.

Learning at a distance (for example, lectures and seminars in which the participants are geographically scattered) is commonplace.

Learning is becoming a significant portion of most jobs. Training and developing new skills is emerging as an ongoing responsibility in most careers, not just an occasional supplement, as the level of skill needed for meaningful employment soars ever higher.
Disabilities

Persons with disabilities are rapidly overcoming their handicaps through the intelligent technology of 2009. Students with reading disabilities routinely ameliorate their disability using print-to-speech reading systems.

Print-to-speech reading machines for the blind are now very small, inexpensive, palm-sized devices that can read books (those that still exist in paper form) and other printed documents, and other real-world text such as signs and displays. These reading systems are equally adept at reading the trillions of electronic documents that are instantly available from the ubiquitous wireless worldwide network.

After decades of ineffective attempts, useful navigation devices have been introduced that can assist blind people in avoiding physical obstacles in their path, and finding their way around, using global positioning system (GPS) technology. A blind person can interact with her personal reading-navigation systems through two-way voice communication, kind of like a Seeing Eye dog that reads and talks.

Deaf persons--or anyone with a hearing impairment--commonly use portable speech-to-text listening machines, which display a real-time transcription of what people are saying. The deaf user has the choice of either reading the transcribed speech as displayed text, or watching an animated person gesturing in sign language. These have eliminated the primary communication handicap associated with deafness. Listening machines can also translate what is being said into another language in real time, so they are commonly used by hearing people as well.

Computer-controlled orthotic devices have been introduced. These "walking machines" enable paraplegic persons to walk and climb stairs. The prosthetic devices are not yet usable by all paraplegic persons, as many physically disabled persons have dysfunctional joints from years of disuse. However, the advent of orthotic walking systems is providing more motivation to have these joints replaced.

There is a growing perception that the primary disabilities of blindness, deafness, and physical impairment do not necessarily impart handicaps. Disabled persons routinely describe their disabilities as mere inconveniences. Intelligent technology has become the great leveler.
Communication

Translating Telephone technology (where you speak in English and your Japanese friend hears you in Japanese, and vice versa) is commonly used for many language pairs. It is a routine capability of an individual's personal computer, which also serves as her phone.
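The pipeline implied here is speech recognition, then text translation, then speech synthesis. A skeleton sketch - the three stage functions are hypothetical stand-ins (trivial stubs), not any real speech API:

<ecode>
# Skeleton of the translating-telephone pipeline. All stages are toy stubs.
def recognize(audio, language):              # speech -> text (CSR)
    return audio                             # stub: pretend audio is already text

def translate(text, source, target):         # text -> text
    return {"hello": "konnichiwa"}.get(text, text)   # toy one-entry dictionary

def synthesize(text, language):              # text -> speech
    return f"<spoken {language}> {text}"

def translating_call(audio_in, src="en", dst="ja"):
    return synthesize(translate(recognize(audio_in, src), src, dst), dst)

print(translating_call("hello"))             # <spoken ja> konnichiwa
</ecode>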

"Telephone" communication is primarily wireless, and routinely includes high-resolution moving images. Meetings of all kinds and sizes routinely take place among geographically separated participants.

There is effective convergence, at least on the hardware and supporting software level, of all media, which exist as digital objects (that is, files) distributed by the ever-present high-bandwidth, wireless information web. Users can instantly download books, magazines, newspapers, television, radio, movies, and other forms of software to their highly portable personal communication devices.

Virtually all communication is digital and encrypted, with public keys available to government authorities. Many individuals and groups, including but not limited to criminal organizations, use an additional layer of virtually unbreakable encryption codes with no third-party keys.
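That two-layer arrangement - an escrowed outer layer wrapped around an inner layer with no third-party key - looks like this in miniature. The sketch uses the modern Python cryptography package purely for illustration; nothing here is from the book:

<ecode>
# Two-layer encryption: outer key escrowed, inner key private.
from cryptography.fernet import Fernet

escrowed_key = Fernet.generate_key()   # a copy imagined to sit with the authorities
private_key = Fernet.generate_key()    # known only to the communicating parties

inner = Fernet(private_key).encrypt(b"meet at noon")
outer = Fernet(escrowed_key).encrypt(inner)

# The escrowed key alone recovers only more ciphertext:
print(Fernet(escrowed_key).decrypt(outer))
# Both layers are needed to read the message:
print(Fernet(private_key).decrypt(Fernet(escrowed_key).decrypt(outer)))  # b'meet at noon'
</ecode>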

Haptic technologies are emerging that allow people to touch and feel objects and other persons at a distance. These force-feedback devices are widely used in games and in training simulation systems.

Interactive games routinely include all-encompassing visual and auditory environments, but a satisfactory, all-encompassing tactile environment is not yet available. The online chat rooms of the late 1990s have been replaced with virtual environments where you can meet people with full visual realism.

People have sexual experiences at a distance with other persons as well as virtual partners. But the lack of the "surround" tactile environment has thus far kept virtual sex out of the mainstream. Virtual partners are popular as forms of sexual entertainment, but they're more gamelike than real. And phone sex is a lot more popular now that phones routinely include high-resolution, real-time moving images of the person on the other end.
Business and Economics

Despite occasional corrections, the ten years leading up to 2009 have seen continuous economic expansion and prosperity due to the dominance of the knowledge content of products and services. The greatest gains continue to be in the value of the stock market. Price deflation concerned economists in the early '00 years, but they quickly realized it was a good thing. The high-tech community pointed out that significant deflation had existed in the computer hardware and software industries for many years earlier without detriment.

The United States continues to be the economic leader due to its primacy in popular culture and its entrepreneurial environment. Since information markets are largely world markets, the United States has benefited greatly from its immigrant history. Being comprised of all the world's peoples--specifically the descendants of peoples from around the globe who had endured great risk for a better life--is the ideal heritage for the new knowledge-based economy. China has also emerged as a powerful economic player. Europe is several years ahead of Japan and Korea in adopting the American emphasis on venture capital, employee stock options, and tax policies that encourage entrepreneurship, although these practices have become popular throughout the world.

At least half of all transactions are conducted online.

Intelligent assistants which combine continuous speech recognition, natural-language understanding, problem solving, and animated personalities routinely assist with finding information, answering questions, and conducting transactions. Intelligent assistants have become a primary interface for interacting with information-based services, with a wide range of choices available. A recent poll shows that both male and female users prefer female personalities for their computer-based intelligent assistants. The two most popular are Maggie, who claims to be a waitress in a Harvard Square café, and Michelle, a stripper from New Orleans. Personality designers are in demand, and the field constitutes a growth area in software development.

Most purchases of books, musical "albums," videos, games, and other forms of software do not involve any physical object, so new business models for distributing these forms of information have emerged. One shops for these information objects by "strolling" through virtual malls, sampling and selecting objects of interest, rapidly (and securely) conducting an online transaction, and then quickly downloading the information using high-speed wireless communication. There are many types and gradations of transactions to gain access to these products. You can "buy" a book, musical album, video, etcetera, which gives you unlimited permanent access. Alternatively, you can rent access to read, view, or listen once, or a few times. Or you can rent access by the minute. Access may be limited to one person or to a group of persons (for example, a family or a company). Alternatively, access may be limited to a particular computer, or to any computer accessed by a particular person or by a set of persons.

There is a strong trend toward the geographic separation of work groups. People are successfully working together despite living and working in different places.

The average household has more than a hundred computers, most of which are embedded in appliances and built-in communication systems. Household robots have emerged, but are not yet fully accepted.

Intelligent roads are in use, primarily for long-distance travel. Once your car's computer guidance system locks onto the control sensors on one of these highways, you can sit back and relax. Local roads, though, are still predominantly conventional.

A company west of the Mississippi and north of the Mason-Dixon line has surpassed a trillion dollars in market capitalization.
Politics and Society

Privacy has emerged as a primary political issue. The virtually constant use of electronic communication technologies is leaving a highly detailed trail of every person's every move. Litigation, of which there has been a great deal, has placed some constraints on the widespread distribution of personal data. Government agencies, however, continue to have the right to gain access to people's files, which has resulted in the popularity of unbreakable encryption technologies.

There is a growing neo-Luddite movement, as the skill ladder continues to accelerate upward. As with earlier Luddite movements, its influence is limited by the level of prosperity made possible by new technology. The movement does succeed in establishing continuing education as a primary right associated with employment.

There is continuing concern with an underclass that the skill ladder has left far behind. The size of the underclass appears to be stable, however. Although not politically popular, the underclass is politically neutralized through public assistance and the generally high level of affluence.
The Arts

The high quality of computer screens, and the facilities of computer-assisted visual rendering software, have made the computer screen a medium of choice for visual art. Most visual art is the result of a collaboration between human artists and their intelligent art software. Virtual paintings--high-resolution wall-hung displays--have become popular. Rather than always displaying the same work of art, as with a conventional painting or poster, these virtual paintings can change the displayed work at the user's verbal command, or can cycle through collections of art. The displayed artwork can be works by human artists or original art created in real time by cybernetic art software.

Human musicians routinely jam with cybernetic musicians. The creation of music has become available to persons who are not musicians. Creating music does not necessarily require the fine motor coordination of using traditional controllers. Cybernetic music creation systems allow people who appreciate music but who are not knowledgeable about music theory and practice to create music in collaboration with their automatic composition software. Interactive brain-generated music, which creates a resonance between the user's brain waves and the music being listened to, is another popular genre.

Musicians commonly use electronic controllers that emulate the playing style of the old acoustic instruments (for example, piano, guitar, violin, drums), but there is a surge of interest in the new "air" controllers in which you create music by moving your hands, feet, mouth, and other body parts. Other music controllers involve interacting with specially designed devices.

Writers use voice-activated word processing; grammar checkers are now actually useful; and distribution of written documents from articles to books typically does not involve paper and ink. Style improvement and automatic editing software is widely used to improve the quality of writing. Language translation software is also widely used to translate written works in a variety of languages. Nonetheless, the core process of creating written language is less affected by intelligent software technologies than the visual and musical arts. However, "cybernetic" authors are emerging.

Beyond music recordings, images, and movie videos, the most popular type of digital entertainment object is virtual experience software. These interactive virtual environments allow you to go whitewater rafting on virtual rivers, to hang-glide in a virtual Grand Canyon, or to engage in intimate encounters with your favorite movie star. Users also experience fantasy environments with no counterpart in the physical world. The visual and auditory experience of virtual reality is compelling, but tactile interaction is still limited.
Warfare

The security of computation and communication is the primary focus of the U.S. Department of Defense. There is general recognition that the side that can maintain the integrity of its computational resources will dominate the battlefield.

Humans are generally far removed from the scene of battle. Warfare is dominated by unmanned intelligent airborne devices. Many of these flying weapons are the size of small birds, or smaller.

The United States continues to be the world's dominant military power, which is largely accepted by the rest of the world, as most countries concentrate on economic competition. Military conflicts between nations are rare, and most conflicts are between nations and smaller bands of terrorists. The greatest threat to national security comes from bioengineered weapons.
Health and Medicine

Bioengineered treatments have reduced the toll from cancer, heart disease, and a variety of other health problems. Significant progress is being made in understanding the information processing basis of disease.

Telemedicine is widely used. Physicians can examine patients using visual, auditory, and haptic examination from a distance. Health clinics with relatively inexpensive equipment and a single technician bring health care to remote areas where doctors had previously been scarce.

Computer-based pattern recognition is routinely used to interpret imaging data and other diagnostic procedures. The use of noninvasive imaging technologies has substantially increased. Diagnosis almost always involves collaboration between a human physician and a pattern-recognition-based expert system.

Doctors routinely consult knowledge-based systems (generally through two-way voice communication augmented by visual displays), which provide automated guidance, access to the most recent medical research, and practice guidelines.

Lifetime patient records are maintained in computer databases. Privacy concerns about access to these records (as with many other databases of personal information) have emerged as a major issue.

Doctors routinely train in virtual reality environments, which include a haptic interface. These systems simulate the visual, auditory, and tactile experience of medical procedures, including surgery. Simulated patients are available for continuing medical education, for medical students, and for people who just want to play doctor.
Philosophy

There is renewed interest in the Turing Test, first proposed by Alan Turing in 1950 as a means for testing intelligence in a machine. Recall that the Turing Test contemplates a situation in which a human judge interviews the computer and a human "foil," communicating with both over terminal lines. If the human judge is unable to tell which interviewee is human and which is machine, the machine is deemed to possess human-level intelligence. Although computers still fail the test, confidence is increasing that they will be in a position to pass it within another one or two decades.
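The protocol reduces to a blind, randomized interview. A skeleton of a single trial - the interviewees and the judge here are hypothetical stand-ins (trivial stubs), not a real implementation:

<ecode>
# Skeleton of a single Turing Test trial, per the description above.
import random

def ask_human(q):
    return "well, it depends..."          # stub human foil

def ask_machine(q):
    return "well, it depends..."          # stub machine under test

def judge_verdict(transcript):
    return random.random() < 0.5          # stub judge: guesses whether A is the machine

def turing_trial(questions):
    machine_is_a = random.random() < 0.5  # blind the judge to the ordering
    transcript = []
    for q in questions:
        a, b = ask_machine(q), ask_human(q)
        if not machine_is_a:
            a, b = b, a
        transcript.append((q, a, b))
    return judge_verdict(transcript) != machine_is_a   # True = judge fooled this trial

# Over many trials, the machine "passes" if judges do no better than chance.
fooled = sum(turing_trial(["Are you human?"]) for _ in range(1000))
print(f"judge fooled in {fooled} of 1000 trials")      # ~500 = indistinguishable
</ecode>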

There is serious speculation on the potential sentience (that is, consciousness) of computer-based intelligence. The increasingly apparent intelligence of computers has spurred an interest in philosophy.

. . . Hey, Molly.

Oh, so you're calling me now.

Well, the chapter was over and I didn't hear from you.

I'm sorry, I was finishing up a phone call with my fiancé.

Hey, congratulations, that's great. How long have you known . . .

Ben, his name is Ben. We met about ten years ago, just after you finished this book.

I see. So how have I done?

You did manage to sell a few copies.

No, I mean with my predictions.

Not very well. The translating telephones, for one thing, are a little ridiculous. I mean, they're constantly screwing up.

Sounds like you use them, though?

Well, sure, how else am I going to speak to my fiancé's father in Ieper, Belgium, when he hasn't bothered to learn English?

Of course. So what else?

You said that cancer was reduced, but that's actually quite understated. Bioengineered treatments, particularly antiangiogenesis drugs that prevent tumors from growing the capillaries they need, have eliminated most forms of cancer as a major killer.6

Well, that's just not a prediction I was willing to make. There have been so many false hopes with regard to cancer treatments, and so many promising approaches proving to be dead ends, that I just wasn't willing to make that call. Also, there just wasn't enough evidence when I wrote the book in 1998 to make that dramatic a prediction.

Not that you shied away from dramatic predictions.

The predictions I made were fairly conservative, actually, and were based on technologies and trends I could touch and feel. I was certainly aware of several promising approaches to bioengineered cancer treatments, but it was still kind of iffy, given the history of cancer research. Anyway, the book only touched tangentially on bioengineering, although it's clearly an information-based technology.

Now with regard to sex--

Speaking of health problems . . .

Yes, well, you said that virtual partners were popular, but I just don't see that.

It might just be the circle you move in.

I have a very small circle--mostly I've been trying to get Ben to focus on our wedding.

Yes, tell me about him.

He's very romantic. He actually sends me letters on paper!

That is romantic. So, how was the phone call I interrupted?

I tried on this new nightgown he sent me. I thought he'd appreciate it, but he was being a little annoying.

I assume you're going to finish that thought.

Well, he wanted me to kind of let these straps slip, maybe just a little. But I'm kind of shy on the phone. I don't really go in for video phone sex, not like some friends I know.

Oh, so I did get that prediction right.

Anyway, I just told him to use the image transformers.

Transformers?

You know, he can undress me just at his end.

Oh yes, of course. The computer is altering your image in real time.

Exactly. You can change someone's face, body, clothing, or surroundings into someone or something else entirely, and they don't know you're doing it.

Hmmm.

Anyway, I caught Ben undressing his old girlfriend when she called to congratulate him on our engagement. She had no idea, and he thought it was harmless. I didn't speak to him for a week.

Well, as long as it was just at his end.

Who knows what she was doing at her end.

That's kind of her business, isn't it? As long as they don't know what the other is doing.

I'm not so sure they didn't know. Anyway, people do spend a lot of time together up close but at a distance, if you know what I mean.

Using the displays?

We call them portals--you can look through them, but you can't touch.

I see, still no interest in virtual sex?

Not personally. I mean, it's pretty pathetic. But I did have to write the copy for a brochure about a sensual virtual reality environment. Being low on the totem pole, I really can't pick my assignments.

Did you try the product?

I didn't exactly try it. I just observed. I would say they put more effort into the virtual girls than the guys.

How'd your campaign make out?

The product bombed. I mean, the market's just so cluttered.

You can't win them all.

No, but one of your predictions did work out quite well. I took your advice about that company north of the Mason-Dixon line. And, hey, I'm not complaining.

I'll bet a lot of stocks are up.

Yes, the boats keep getting higher.

Okay, what else?

You're right about the disabled. My office mate is deaf, and it's not an issue at all. There's nothing important a blind or deaf person can't do today.

That was really true back in 1999.

I think the difference now is that the public understands it. It's just a lot more obvious with today's technology. But that understanding is important.

Sure, without the technology, there's just a lot of misconception and prejudice.

True enough. I think I'm going to have to get going, I can see Ben's face on my call line.

He looks like a St. Bernard.

Oh, I left my image transformers on. Here, I'll let you see what he really looks like.

Hey, good-looking guy. Well, good luck. You do seem to have changed.

I should hope so.

I mean I think our relationship has changed.

Well, I'm ten years older.

And it seems that I'm asking you most of the questions.

I guess I'm the expert now. I can just tell you what I see. But how come you're still stuck in 1999?

I'm afraid I just can't leave quite yet. I have to get this book out, for one thing.

I do have one confusion. How is it that you can talk to me from 1999 when I'm here in the year 2009? What kind of technology is that?

Oh, that's a very old technology. It's called poetic license.

Originally published in The Age of Spiritual Machines © 1999 Raymond Kurzweil

Funniest line goes to... (5, Insightful)

Dutch Gun (899105) | more than 5 years ago | (#26339649)

"Style improvement and automatic editing software is widely used to improve the quality of writing."

So close [nwsource.com], and yet [xkcd.com] so, so far... [xkcd.com]

Almost all the predictions I read in this article have roughly the same problem - they assume technology is much more ubiquitous than it is in the real world. I'd say he was probably off by five to ten years in many of those predictions. Let's see:

Computers: Personal computers are available in a wide range of sizes and shapes, and are commonly embedded in clothing and jewelry such as wristwatches, rings, earrings, and other body ornaments... The majority of text is created using continuous speech recognition (CSR) dictation software.

Getting there, but we're not quite at the point of wearing computers in common objects. Keyboard and mouse are still king.

Education: Students of all ages typically have a computer of their own, which is a thin tabletlike device weighing under a pound with a very high resolution display suitable for reading... Intelligent courseware has emerged as a common means of learning.

Closer, but education still seems largely clueless about how to effectively use computers. Intelligent teaching software is making strides, but still really can't be called "intelligent" by any stretch of the imagination.

Communication: "Telephone" communication is primarily wireless, and routinely includes high-resolution moving images... Virtually all communication is digital and encrypted, with public keys available to government authorities.

Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular.

Business and Economics: Intelligent assistants which combine continuous speech recognition, natural-language understanding, problem solving, and animated personalities routinely assist with finding information, answering questions, and conducting transactions... Most purchases of books, musical "albums," videos, games, and other forms of software do not involve any physical object.

Again, the overestimation of natural interfaces. And as of right now, a large percentage of software (especially games) is still attached to a physical disk, although digital downloads are gaining Steam... (sorry)

Politics and Society: Privacy has emerged as a primary political issue. The virtually constant use of electronic communication technologies is leaving a highly detailed trail of every person's every move.... There is a growing neo-Luddite movement...

This one's pretty close regarding privacy concerns. As for a neo-Luddite movement, I haven't seen one emerge in large numbers. There are some anti-technologists, but that's usually a secondary effect of some other philosophical argument.

The Arts: The high quality of computer screens, and the facilities of computer-assisted visual rendering software, have made the computer screen a medium of choice for visual art.

Another one technologists always get wrong is the idea that people are eager to throw away traditional art mediums. I think Star Trek was closer on this one, about how people will always enjoy timeless "classical" entertainment right alongside their "high-tech" (holodeck) entertainment. The two need not be mutually exclusive.

Etc, etc... I'd say the predictions were generally on the right track, but perhaps just a bit too optimistic in the rate of adoption. Still, overall it was fairly insightful, if somewhat conservative. I'm not sure I could have done nearly as well.

Re:Funniest line goes to... (2, Insightful)

Anonymous Coward | more than 5 years ago | (#26340151)

"Communication: "Telephone" communication is primarily wireless, and routinely includes high-resolution moving images... Virtually all communication is digital and encrypted, with public keys available to government authorities.

Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular."

I think you're missing the reason. It's not that people prefer text messaging. I mean, look at Skype - I'm pretty sure people make heavy use of video calls (I know people in my family do).

I think it's a bit of a UI issue, a location issue, and a penetration issue. Video phones will generally be camera phones - and it's difficult to imagine a UI where the same lens can be shared and used successfully in both modes.

Mobile phones are used everywhere. Video phones require full concentration from both parties, which usually isn't practical on the go. Nor do people generally want that big an interruption - text messages are far more discreet (both sending and receiving) and more private (others around you generally don't know what you've texted).

Additionally, I think there's unwillingness on the part of some cell providers, because it'll increase data usage without an obvious rise in value. They're being stupid: full integration with the major players (e.g. Skype - although this may be more difficult - Google, MSN, & Yahoo) would expand the value of their network significantly (Metcalfe's law and all), and wouldn't even need immediate extensive support from other cell companies or phone manufacturers (although they should define the standard so that the cost of entry would be minimal for new players - which obviously they would never do, even though it would be in their interest).
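For reference, Metcalfe's law values a network at roughly the square of its user count, which is why interconnection beats keeping networks separate. A toy calculation, with made-up user counts:

<ecode>
# Metcalfe's law toy: value ~ n^2, so one joined network beats two separate ones.
carrier_users = 50_000_000           # made-up numbers
skype_users = 400_000_000

def value(n):
    return n ** 2                    # arbitrary units

separate = value(carrier_users) + value(skype_users)
combined = value(carrier_users + skype_users)
print(f"combined is {combined / separate:.2f}x the separate networks")  # ~1.25x
</ecode>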

I hope video phones do get here, but I'm guessing there are several important factors at play, the main one really being the social aspect.

Re:Funniest line goes to... (2, Insightful)

SwedishPenguin (1035756) | more than 5 years ago | (#26340243)

Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular.

Wireless video phones are widely available today and have been for years; even my (pretty cheap) phone has that feature. I've never seen anyone use it, though, and I've never used it myself. It seemed like a really cool idea in SciFi movies and TV shows, but in reality it just isn't all that necessary to see the person you're speaking to, especially when you're on the move with your cellphone.

Re:Funniest line goes to... (1)

Lars512 (957723) | more than 5 years ago | (#26340707)

Exactly. People only seem to use video calls on their computers, when they're stationary - for example, when Skype-calling one another.

Kreskin said it best (0)

Anonymous Coward | more than 5 years ago | (#26339655)

"BSD is Dying."

  1. Kreskin said it. [amazingkreskin.com]
  2. I believe it.
  3. That settles it.

Re:Kreskin said it best (0)

Anonymous Coward | more than 5 years ago | (#26340187)

did Netcraft confirm it?

Not that Kurzweil? (1)

bakedpatato (1254274) | more than 5 years ago | (#26339733)

I blame RSS. I subscribe to music-industry feeds in addition to /., and I thought it was plausible that Kurzweil (the company) would predict lower sales or something like that for 2k9.

Pretty close though (2, Insightful)

TheSync (5291) | more than 5 years ago | (#26339809)

I was sitting next to someone with a Kindle on a plane last week, so the digital paper thing is moving fast.

Rotational storage is not going away anytime soon (who thought we'd have terabyte drives?), but certainly my iPhone can do a heck of a lot of computing with just Flash.

Not right about much that's important (2, Insightful)

Al Dimond (792444) | more than 5 years ago | (#26339913)

The things he was right about were fields where the path forward was pretty certain. We had a pretty good idea then how we'd make microchips smaller and faster, a clear path forward. Only now is that path getting clouded by physical limits. Where he was wrong was in predicting steady, linear progress in areas where there isn't a clear path forward. This includes AI, interface design, economics, and general welfare (I just love his dismissal of the underclass; they're a pretty big portion of humanity, you know, and I don't think the human story can be truly told without theirs as well).

I have a better track record than he does. (2, Interesting)

Animats (122034) | more than 5 years ago | (#26340091)

I run Downside [downside.com], where, in 2000, I called the dot-com crash before it happened and named names. Check my track record. Since then, I've occasionally pointed out the obvious before it became conventional wisdom:

  • 2004-10-11 - The coming mortgage crunch
    The next crash looks to be housing-related. Fannie Mae is in trouble. But not because of their accounting irregularities. The problem is more fundamental. They borrow short, lend long, and paper over the resulting interest rate risk with derivatives. In a credit crunch, the counterparties will be squeezed hard. The numbers are huge. And there's no public record of who those counterparties are.
    Derivatives allow the creation of securities with a low probability of loss coupled with a very high but unlikely loss. When unlikely events are uncorrelated, as with domestic fire insurance, this is a viable model. When unlikely events are correlated, as with interest rate risk, everything breaks at once (see the sketch after this list). Remember "portfolio insurance"? Same problem.
    Mortgage financing is so tied to public policy that predictions based on fundamentals are not possible. All we can do is to point out that huge stresses are accumulating in that sector. At some point, as interest rates increase, something will break in a big way. The result may look like the 1980s S&L debacle.
  • 2006-01-01 - Predictions for 2006
    • Saudi Arabia finally admits the Ghawar field has peaked. Oil passes $70 per barrel.
    • US interest rate spike. "Homeowners" with adjustable-rate interest-only loans default and are foreclosed. Housing prices crash as foreclosures glut the market.
    • Nobody wins in Iraq. Neither side can force a decision, so both sides keep bleeding.
    • One of the big three US car manufacturers goes bankrupt.
    • A major hurricane wipes out another southern US city.
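The correlation point in the 2004 note is easy to see in a toy simulation (all parameters made up): a book of many small-probability, large-loss bets is survivable when the triggers are independent, and catastrophic when one shared factor - say, an interest-rate spike - trips them all at once.

<ecode>
# Toy Monte Carlo of correlated vs. uncorrelated tail risk (made-up numbers).
import random

BETS, P_LOSS, LOSS, YEARS = 1000, 0.01, 100, 1000

def worst_year(correlated):
    worst = 0
    for _ in range(YEARS):
        if correlated:      # one shared trigger trips every bet at once
            total = BETS * LOSS if random.random() < P_LOSS else 0
        else:               # each bet fails independently
            total = sum(LOSS for _ in range(BETS) if random.random() < P_LOSS)
        worst = max(worst, total)
    return worst

print("independent worst year:", worst_year(False))  # stays near the 1,000 mean
print("correlated worst year: ", worst_year(True))   # the full 100,000 blowup
</ecode>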

The 2004 prediction describes exactly what happened in housing. No question about that.

The 2006 predictions took longer to happen than I'd expected. The Fed cut rates sharply in 2007, accelerating the economy when it should have been hitting the brakes. This deferred the collapse of the housing bubble, but not for long. When it did pop, it was worse than it had to be.

I expected one of the car manufacturers to go bust. Instead, they all almost went bust, and only a Government bailout saved them. The fundamentals indicated something had to give. The housing bubble and interest rate cuts resulted in something of a "car bubble", deferring the inevitable a few more years.

The hurricane prediction was kind of off the wall, but Galveston was duly flattened.

It's nice to be right, but it isn't happy-making.

Re:I have a better track record than he does. (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26340287)

Chrysler has almost gone bankrupt in every one of the four U.S. economic downturns in the last thirty years (well, it couldn't actually go under on its own in the dot-com crash, but it cost Daimler a lot of money). Predicting a serious economic problem and predicting an auto company will go under is like, well, predicting somebody will be hit in the head with a bullet and predicting that same person or one of his two friends will die. You're automatically going to be close to right on the second prediction if you're right on the first.

Shyster (0)

Anonymous Coward | more than 5 years ago | (#26340119)

Kurzweil is a fucking idiot. And let this prompt any of his defenders to list his amazing achievements.

Bored. (1)

Zenne (1013871) | more than 5 years ago | (#26340141)

That was a boring read. It feels like - mind, I don't really know anything about the author - he picked a lot of different subjects, magnified them all, and happened to be right on a few things because they followed through to their natural conclusion. It was reminiscent of flipping through college textbooks from the 80s - silly predictions mixed in with ones that happened. I'm so surprised.

Bah! Humbug! (2, Insightful)

binpajama (1213342) | more than 5 years ago | (#26340165)

The biggest problem with Kurzweil's view of the world is that it assumes that any innovation, if technologically feasible, is going to be adopted. Take, as a simple example, the issue of voice-to-voice translation that he raises in the article: it's just more economical and practical to do business with someone who knows English (or has easy access to someone who does).

Similar wishful thinking by sci-fi doyens produced visions of space colonies and interstellar travel by the first decade of the 21st century or soon afterwards (e.g. 2001: A Space Odyssey) back in the '60s and '70s, when the edges of the universe seemed to be just another Manhattan Project away. We all know how that has turned out.

Yes, there are lots of cool things that technology can produce. It will produce them for a population, however, that is more concerned with surviving on a decreasing resource base than with the pursuit of techno-Utopia. Just because a small population of geeks in the US can afford and enjoy playing with gizmos doesn't mean the technology is pervasive in the "world". Yes, computational power increases with time, and that can be channeled into all kinds of innovation, which is the gist of what Kurzweil is saying. That increase in computational power has limited scalability, however, unless you assume that all the world cares about is playing PC games, downloading music, and watching videos online. [Note: By world, I mean the world outside /. Yes, I can prove it exists!]

I think Kurzweil is going to be increasingly disappointed in the coming decades.

Yeah, not so much :p (1)

VagaStorm (691999) | more than 5 years ago | (#26340229)

"There are services to keep one's digital objects in central repositories, but most people prefer to keep their private information under their own physical control." Bet google is glad that never hapened :p

Desperate for Singularity (5, Insightful)

Sepiraph (1162995) | more than 5 years ago | (#26340241)

I think Kurzweil is desperate for the Singularity to happen sooner because, frankly, he just doesn't want to die.

Re:Desperate for Singularity (1)

Lars512 (957723) | more than 5 years ago | (#26340745)

Maybe he just wants to live long enough to see it begin. Then again, it changes the whole ball game; it seems a shame to die when it looks likely that one day people won't have to (for example, through digitising themselves).
