
The Future of Computing

samzenpus posted more than 8 years ago | from the put-on-your-time-helmet dept.


webglee writes "What will the relationship between computing and science bring us over the next 15 years? That is the topic addressed by the web focus special of Nature magazine on the Future of Computing in Science. Amazingly, all the articles are free access, including a commentary by Vernor Vinge titled 2020 Computing: The creativity machine."


Don't underestimate... (4, Interesting)

JDSalinger (911918) | more than 8 years ago | (#14980245)

It is easy to underestimate the speed at which technology is changing. Barring brick walls (insurmountable laws of physics), computing in 2020 should be absurdly different from that of today.
According to Ray Kurzweil: "An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense "intuitive linear" view. So we won't experience 100 years of progress in the 21st century -- it will be more like 20,000 years of progress (at today's rate). The "returns," such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light."

Re:Don't underestimate... (-1)

Anonymous Coward | more than 8 years ago | (#14980301)

Ok, just so that the math people don't all fall dead on this one, exponential growth rate of exponential growth is still just exponential growth (with a bigger factor).

Re:Don't underestimate... (2, Funny)

Anonymous Coward | more than 8 years ago | (#14980314)

It is easy to underestimate the speed at which technology is changing. Barring brick walls (insurmountable laws of physics), computing in 2020 should be absurdly different from that of today.

No kidding - by 2020 we should just be able to start playing Duke Nukem Forever in Windows Vista.

Re:Don't underestimate... (1)

Half a dent (952274) | more than 8 years ago | (#14980318)

I have no doubt that advances will be huge but this should be tempered by the "accuracy" of past predictions.

I am still waiting for my flying car and personal jetpack promised in the fifties. Cheap re-usable spacecraft. Flights from New York to Tokyo in under 3 hours. Cure for cancer anyone?

I am keeping my fingers crossed of course but not holding my breath.

Re:Don't underestimate... (3, Insightful)

vertinox (846076) | more than 8 years ago | (#14980502)

Well, we have flying cars [com.com], but I doubt we will see them take off... Errr... No pun intended.

The reason we don't have flying cars today is that the leading unnatural cause of death in the United States is car accidents. Could you imagine what would happen if a drunk driver got into a vehicle that could fly at 10,000 ft and 300 mph into a building or other cars?

So flying cars and jet packs aren't a reality because of humans' inability to control moving vehicles without accidents. Once we have pure AI driving our cars it might be more feasible, but we are looking at 2020 at the earliest.

Re:Don't underestimate... (3, Insightful)

utexaspunk (527541) | more than 8 years ago | (#14981251)

Once we have pure AI driving our cars it might be more feasible, but we are looking at 2020 at the earliest.

Even at that point, it seems unlikely. If we'll have flying cars that drive themselves, we'll most likely have normal cars that drive themselves. If we have normal cars that drive themselves, most of the problems that we think flying cars will solve would be moot: no more traffic jams, higher speed limits, no stop lights, etc. Since we already have the infrastructure for 2-D travel, and since flying cars would likely use more energy (you're using a good portion of your energy to fight gravity instead of moving forward), and since any failure of a flying car is a lot more likely to result in a death, I think it will be a lot longer than that, if it ever happens at all.

Re:Don't underestimate... (1)

TheRaven64 (641858) | more than 8 years ago | (#14981659)

It is quite a lot easier to design an autopilot for a flying car than for a land-based one, if your flying car can VTOL. A flying car just needs to go upwards for a bit, move in a straight line towards the target while avoiding obstacles (which can be detected using radar), and then land. A ground car needs to be able to recognise the road, which is a non-trivial computer vision problem, particularly on country roads. It also needs to be able to recognise road signs and signals (for the first few generations, at least).
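As a rough sketch of that climb / fly straight / avoid / land logic (everything here -- the names, the altitudes, the radar representation -- is invented for illustration, not a real autopilot):

# Hypothetical sketch of the "climb, fly straight, avoid, land" autopilot
# described above. All names and numbers are made up for illustration.
from dataclasses import dataclass

@dataclass
class State:
    x: float          # metres east of start
    altitude: float   # metres above ground

CRUISE_ALT = 300.0    # assumed safe cruising altitude
STEP = 50.0           # metres covered per control tick

def autopilot(state: State, target_x: float, radar_contacts):
    """Return the next state: climb, then head straight for the target,
    climbing over anything radar reports ahead, then descend."""
    if state.altitude < CRUISE_ALT:
        return State(state.x, state.altitude + STEP)           # climb phase
    if abs(state.x - target_x) > STEP:
        if any(abs(c - state.x) < 2 * STEP for c in radar_contacts):
            return State(state.x, state.altitude + STEP)        # avoid: climb over obstacle
        return State(state.x + STEP if target_x > state.x else state.x - STEP,
                     state.altitude)                             # cruise phase
    return State(target_x, max(0.0, state.altitude - STEP))      # descend / land

print(autopilot(State(0.0, 0.0), target_x=1000.0, radar_contacts=[400.0]))  # first tick: climb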

Re:Don't underestimate... (1)

booch (4157) | more than 8 years ago | (#14982006)

Flying cars would still be quite a bit faster, as you can travel in a straight line instead of following winding roads. I believe aerodynamic issues will also limit the speed of ground-based transportation. Even the fastest street cars today only go up to about 200 mph. The slowest planes typically travel at about 100 mph.

Re:Don't underestimate... (1)

Pneuma ROCKS (906002) | more than 8 years ago | (#14982104)

It would also make certain types of terrorist acts much easier and traffic accidents much more catastrophic. Pile up? Try pile down. Even with AI I don't think accidents or intentional tampering can be completely avoided. Flying cars are a neat idea on paper and on TV. Let's leave them there.

Don't overestimate... (5, Insightful)

AKAImBatman (238306) | more than 8 years ago | (#14980321)

Remember how all the SciFi shows of the 60's thought that we'd be cruising the solar system (perhaps even the stars!) by the year 2000? The Jupiter II optimistically took off in 1999, and Star Trek contained several references to "Eugenics Wars" and "early space travellers" that were supposed to have happened by now.

What do we actually have? The same space shuttle that's been flying since the late 70's, and updates to the same rockets that have existed throughout the history of the space program.

Technology does progress at an exponential rate. The only problem is that the focus of technology moves. Computers have already gone through several booms of massive technological increase, and are now very stable creations. There's just as good a chance that they'll continue to improve in a more linear fashion (à la automobiles) as there is that they'll experience exponential increases in technological sophistication. I personally find it more likely that technology will begin to focus on improving other areas for now, and allow computers to remain stable for the time being.

So be careful not to severely overestimate while you're attempting to avoid underestimation.

Re:Decentrialization is key. (4, Insightful)

vertinox (846076) | more than 8 years ago | (#14980490)

Might I point out that more money is probably put into the cell phone, telecom, and computer industries than all the world's space programs combined.

The reason we aren't seeing great advancements in our space and nuclear programs is that they are highly centralized and at the whim of a select few as to whether they get funding or not.

However, when technology is decentralized... As in everyone can have a cell phone, broadband, and a computer within their means... then those types of technology will advance at an accelerating rate. (I hope I don't sound like Kurzweil.)

Not everyone can go to the moon... But most everyone in the western world can have an Xbox 360. May not mean everyone is going to get one... But more than enough to cause rampant R&D in that industry.

Trust me... I'm shocked myself. I remember a time when we didn't have cell phones, computers with hard drives (I miss my old IBM PCjr), internet, 4-7 channel TVs, and everything else that is happening now... And I'm only 27.

Things are happening at an accelerating pace... Short of a world disaster or an economic depression like the 1930's, I doubt we will see a slowdown.

Re:Decentrialization is key. (1, Insightful)

Anonymous Coward | more than 8 years ago | (#14980746)

You nailed it, funding is absolutely critical. Which is why it's so important we stop doing things that divert resources (or waste them completely). I'm not talking about spending money on entertainment, I'm talking about companies having to defend themselves from useless patents. Or companies defending themselves from idiot customers who didn't know the shiny device they just bought shouldn't be used in the bath. Or companies spending their resources inventing technologies that limit the utility of another technology (DRM). Or monopolies controlling a market.

Re:Decentrialization is key. (1)

kabocox (199019) | more than 8 years ago | (#14981445)

Trust me... I'm shocked myself. I remember a time when we didn't have cell phones, computers with hard drives (I miss my old IBM PCjr), internet, 4-7 channel TVs, and everything else that is happening now... And I'm only 27.

Damn, I'm 27 and now I feel old. When I first read your post I was like, I can remember dad having a cell phone for the longest time. If I remember correctly it was either a bag phone or mounted in his company vehicle. Now, my wife, mom, dad, and two brothers have one. Computers with hard drives... I remember using our Apple IIc. I think my mom still has it, and it was working if I ever cared to hook it up. Internet: my first exposure to the internet was in 1995 in high school. I had it in college. I've had dial-up on and off again since 2000. 4-7 channel TVs: actually this could be a trick one. TVs really should only have a video selector switch. I grew up on cable. We moved to the boonies in junior high and only had ABC 3, NBC 6, CBS 12, FOX 16, UPN 33, and 2 PBS stations. In college, I had basic cable. Since college, we've been strictly DVD & VHS with rabbit ears every now and then.

Although I like the idea of exploring space and all the neat things that we could do up there, I'm glad we've said forget it. I'd rather have cheap entertainment and cell phones than have "a man" walk on the moon. If I could afford $50ish a month, we'd have broadband. If I spent the same amount again, I'd have cable or DirecTV. There is a reason that I don't have those things... It's lack of money.

Re:Decentrialization is key. (2, Insightful)

HiThere (15173) | more than 8 years ago | (#14982277)

Although I like the idea of exploring space and all the neat things that we could do up there, I'm glad we've said forget it. I'd rather have cheap entertainment and cell phones than have "a man" walk on the moon.

To my mind this is very short-sighted. Perhaps it's appropriate that we have fallen back to regroup, but not going into space in a large scale is suicidal -- not on an individual basis, but for the species. The only question is the appropriate time frame. Perhaps it's appropriate that we stop and do a bit more development before another big push. This is very different from "stop and sit on our hands", however.

Toys are fun, but they're only really important if they're a step towards getting where you need to go. I enjoy computer games, but I don't really consider them important...except that gamers have helped push the development of computer technology.

Re:Decentrialization is key. (1)

RicktheBrick (588466) | more than 8 years ago | (#14981489)

Here are my predictions on what will be available in the year 2026. First, factories will be built here in the United States but most of the workers will live in India or China. They will use the internet to control robots in the factory. Apartment buildings will have examination rooms where one will go, and there will be robots controlled by a doctor in India or China who will be able to remotely do an exam even better than if the doctor were there in person. Every automobile will be part of a network where it will know the location, speed and direction of travel of every other automobile within 100 yards of it. The automobile will know the speed limit and the location of every stop sign or traffic light and its present condition. The automobile will take defensive steps to avoid an accident. Eventually all humans will be part of a large network where all knowledge and experiences will be available to everyone.
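A toy sketch of the "every car knows every other car within 100 yards" idea (the Vehicle fields and the brute-force neighbour search below are purely illustrative assumptions, not any real vehicle-networking protocol):

# Toy illustration of cars sharing position/speed/heading over a network
# and each car querying its neighbours within 100 yards. Entirely hypothetical.
import math
from dataclasses import dataclass

YARDS_100_IN_M = 91.44  # 100 yards expressed in metres

@dataclass
class Vehicle:
    vid: str
    x: float        # metres
    y: float        # metres
    speed: float    # m/s
    heading: float  # degrees

def neighbours(me: Vehicle, fleet: list[Vehicle]) -> list[Vehicle]:
    """All other vehicles within 100 yards of `me`."""
    return [v for v in fleet
            if v.vid != me.vid
            and math.hypot(v.x - me.x, v.y - me.y) <= YARDS_100_IN_M]

fleet = [Vehicle("a", 0, 0, 30, 90), Vehicle("b", 50, 20, 25, 270), Vehicle("c", 500, 0, 20, 0)]
print([v.vid for v in neighbours(fleet[0], fleet)])  # -> ['b']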

Re:Decentrialization is key. (1)

bermudatriangleoflov (951747) | more than 8 years ago | (#14981917)

I agree with you on this... however I would add that no significant advances in space travel have occurred because there is no market for it. When the day comes that money can be made by sending people who make 40 grand a year to Mars, I guarantee you will see an explosion in space travel technology.

Re:Decentrialization is key. (1)

ceoyoyo (59147) | more than 8 years ago | (#14982312)

4-7 channel TVs? Am I reading that wrong? Growing up our TV SUPPORTED even more than 7 channels (I think it was something like 15). Of course, we could only get two, one kind of fuzzy. Except on those rare nights when everything was perfect sometimes you could get the sound from the French channel.

I've got a few hundred channels now though. :)

Re:Don't overestimate... (4, Insightful)

MrFlibbs (945469) | more than 8 years ago | (#14980508)

Indeed. One thing that's easily overlooked is that even though the hardware performance has increased exponentially, the software development has not. Those tasks that are compute-bound benefit directly from the exponential hardware growth, but other tasks do not.

Software is hard -- perhaps fundamentally so. It cannot be written exponentially faster even with infinite hardware resources. Vast hardware improvements may support vast software possibilities, but writing that software is still a daunting task.

Re:Don't overestimate... (1)

Spy der Mann (805235) | more than 8 years ago | (#14981289)

Software is hard -- perhaps fundamentally so.

Yes, because computers can't design software. Humans are the ones who have to do it.

I've said before that there are still algorithms that need to be designed - intelligent audio compression through sampling (if there's a piano, extract the necessary information and just store the notes and variations; if it's a voice, just store the vowels/consonants and pitch changes, with the rest of the "noise" as high-fidelity info), sprite-based video compression (that's part of the MPEG-4 standard)... speech recognition algorithms that WORK and can understand context...
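A hypothetical sketch of the "just store the notes" idea, using a MIDI-like event list instead of raw samples (the names and fields are invented; this is not a real codec):

# Instead of ~44,100 samples per second, keep a list of note events.
from dataclasses import dataclass

@dataclass
class NoteEvent:
    start: float      # seconds
    duration: float   # seconds
    pitch: int        # MIDI note number (60 = middle C)
    velocity: int     # 0-127 loudness

# A few seconds of piano reduced to symbolic events:
piano_phrase = [
    NoteEvent(0.0, 0.5, 60, 90),
    NoteEvent(0.5, 0.5, 64, 85),
    NoteEvent(1.0, 1.0, 67, 95),
]
# The residual "noise" mentioned above would still need conventional
# high-fidelity coding alongside this symbolic track.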

What happened with all those wonderful futuristic predictions? Perhaps they belong to the future - the one that will never arrive.

Re:Don't overestimate... (1)

oliverthered (187439) | more than 8 years ago | (#14981628)

Task: 'Write a web page'

a) Download a trial of Dreamweaver and write a simple web page; test your page with Firefox.

b) Pop in a DOS disk, fire up debug and write a text editor to write your web page, then write a GUI and web browser in ASM to test your web page. Hand-write all your browser tests too; don't use the automated ones on the web.

There's been a lot of change in the way people write software. Ever used a punch card and waited a day or two to get your debug results back?

Re:Don't overestimate... (1)

duffstone (946343) | more than 8 years ago | (#14982158)

Completely agree. 14 years from now won't look significantly different from today unless changes are made in the software development process: not just the process, but the actual language architecture used.

The best example I can give is having to manually program decision-making logic for every program I write. This should have already been streamlined. Why I have to tell a program how to think EVERY time I write a proc is a mystery to me.

IF/Then has been a crutch for software developers for far too long. There are other aspects of modern-day programming that are just as repetitive and inefficient as well; I just couldn't think of a shorter example than this.

-Duff
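One hedged sketch of what moving those repeated IF/Then ladders into data might look like (the rules and handler names below are invented purely for illustration):

# Put the decisions in a data table and write the dispatch loop once,
# instead of re-writing the same if/then ladder in every procedure.
def refund(order):   return f"refund {order['id']}"
def escalate(order): return f"escalate {order['id']}"
def ignore(order):   return f"ignore {order['id']}"

RULES = [
    (lambda o: o["damaged"] and o["value"] > 100, escalate),
    (lambda o: o["damaged"],                      refund),
    (lambda o: True,                              ignore),   # default
]

def decide(order):
    # One generic loop replaces the repeated if/then ladder.
    for predicate, action in RULES:
        if predicate(order):
            return action(order)

print(decide({"id": 7, "damaged": True, "value": 250}))  # -> escalate 7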

Re:Don't overestimate... (1)

Frumious Wombat (845680) | more than 8 years ago | (#14981266)

Anecdotally, I ran across an old Trek episode a while back (the one with Kodos), and to bring up the voice-print files Spock is shuffling a handful of what appear to be 720K floppies. So, in the 1960s they could imagine transluminal transportation, extracting useful energy from fusion or matter/antimatter annihilation, and world peace, but they thought computers would be stuck with a storage medium that could only hold one voice-print per 3-4" square cartridge, that would be transported by SneakerNet. This is similar to RAH's stories from the 1950s where it was accepted that navigators in space would have memorized enormous logarithmic tables.

Here we are in 2006, and we're puttering in LEO, but our computers are large, fast, and well-connected enough to archive all of mankind's knowledge, simulate sub-molecular processes with usable accuracy, and monitor every communication going in and out of a 300 million person country. Makes you wonder what we're overlooking that's in development, but just under the radar.

New technologies change fast, older ones don't (1)

ChrisA90278 (905188) | more than 8 years ago | (#14981303)

Not all technology changes at the same rate. Microelectronics technology is far from mature and it's changing fast. Chemical rockets have become a mature technology. Just look at the two newest large space boosters, the Delta IV and the Atlas V: mechanically there is nothing in there an engineer from the 1960s would not recognize. The same goes for passenger jet aircraft. Rapid changes in the first half of the 20th century and little change from the 1970's to the present. I suspect that people in the 1930's figured that airplanes in 2006 would be 1000 feet long and fly at 18 times the speed of sound and look like cruise ships inside, complete with casinos and swimming pools. It didn't work out that way. They were just in a "bubble" of rapid airplane technology development and that pace was not sustainable. BTW, it is only microelectronics that is advancing quickly now. Computers have not changed much except to get cheaper and a little faster. And software has not changed much at all either.

Re:Don't overestimate... (1)

booch (4157) | more than 8 years ago | (#14981869)

Yes, but you'll notice that we already have 23rd-century communicators, tri-corders, data storage*, and display technology. As a sibling post points out, that's because these are easily commoditizable and there's lots of money to be made.

I think this bolsters your theory that it's the focus that matters most. And focus is largely a matter of markets and profitability.

* There was an episode in ST:TOS where they were plugging in 2.5-inch orange squares into a computer. I don't recall now what was on them. But when mini-discs came out in the early 1990s, I remarked that they were about the same size, and held roughly the same amount of data.

Re:Don't overestimate... (0)

Anonymous Coward | more than 8 years ago | (#14982263)

The Jupiter II optimistically took off in 1999

Erm, the moon was blasted out of earth's orbit Sept 13, 1999, but the Jupiter 2 was launched Oct 16, 1997 (my 43rd b'day).

Re:Don't overestimate... (1)

ceoyoyo (59147) | more than 8 years ago | (#14982281)

Knowledge appears to increase exponentially (actually, Kurzweil says double exponentially). That doesn't mean that individual technological solutions will advance the same way, particularly over short periods of time.

We haven't seen a boom in space because we're lacking new propulsion (think of all those SF shows -- I don't think any of them had us riding around on chemical rockets) and we haven't really put the will into it.

However, advances have been made. We have ion engines a la Star Wars now. People are starting to talk seriously about fusion drives. The old chemical drives have gotten to the point where things like the X Prize can happen.

The progress is there, we just saw a lull in that one particular area of engineering. That happens from time to time, when people lose their purpose.

Re:Don't underestimate... (1)

CRCulver (715279) | more than 8 years ago | (#14980322)

While I appreciate Kurzweil's insights in The Age of Spiritual Machines [amazon.com], I sometimes wonder if the technological progress he foresees will be slowed down by companies trying to give consumers the least computing power they can while still being competitive, all while charging as much as possible. Add to that the very real possibility of a Luddite reaction against new technology, and Kurzweil's timeline doesn't seem so sure anymore.

Re:Don't underestimate... (2, Insightful)

Anonymous Coward | more than 8 years ago | (#14980384)

Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history.

Hahaha. That shit is just too funny. "The Singularity" eh?

Let's just ignore the last 50-odd years of AI research. The problem is Real Fucking Hard (tm) and throwing more hardware at it just isn't working (see: Combinatorial Explosion, NP Complete, etc.). Computers are very good at doing mechanical things very quickly. Intelligent, they are not. Nor does it appear they are going to be intelligent any time soon (sorry SciFi fans). Don't worry though, they'll still kick your arse in chess.

I'd be quite pleased if he pointed out exactly which promising AI technology will lead to this "Singularity" instead of just assuming it's going to be done.

Re:Don't underestimate... (4, Insightful)

Dr. GeneMachine (720233) | more than 8 years ago | (#14980398)

I actually don't buy into Kurzweil's singularity theory. I am not sure where he pulls that super-exponential growth figure from. Looking at past technological advances, I rather think that technological growth follows a succession of sigmoids. First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau. Then there might be nothing for quite some time, till the next advancement phase sets in.

Such a development model might very well go on for a long time, without reaching a Kurzweil-style singularity.
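As a rough numeric illustration of that succession-of-sigmoids picture (all parameters below are arbitrary, chosen only to show the shape):

# Overall capability as a sum of logistic curves, each representing one
# buildup / breakthrough / plateau cycle. Parameters are made up.
import math

def logistic(t, midpoint, steepness=1.0, height=1.0):
    return height / (1.0 + math.exp(-steepness * (t - midpoint)))

def capability(t):
    # Three successive S-curves: looks exponential mid-breakthrough,
    # flat on each plateau, and never requires a singularity.
    return (logistic(t, midpoint=10, steepness=0.8, height=1.0)
            + logistic(t, midpoint=25, steepness=0.8, height=3.0)
            + logistic(t, midpoint=40, steepness=0.8, height=9.0))

for year in range(0, 51, 10):
    print(year, round(capability(year), 2))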

Re:Don't underestimate... (1)

vertinox (846076) | more than 8 years ago | (#14980670)

First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau.

Where is the plateau? Even in the Dark Ages, technology still advanced (albeit more along the lines of masonry, shipbuilding, and arms and armour manufacturing). A mounted armored knight in the 1200's was considerably better armed than a Roman legionary in 100 AD.

But I suggest you read his book... I don't agree with everything he says, or that a singularity will happen like he says, but he does point out that the model of advancement isn't tied directly to one technology itself but to all technologies.

In my view, a farm plow might lead to better production, which leads to more free time and economic benefits, and money gets put into other technologies that later lead to mass production in factories, which leads to further economic benefits and more investment in technologies, which eventually leads to still more investment in technology.

I remember that in the early 1990's there wasn't much of a computer industry. Now most of the people I know have some job related to it. Technology generates more jobs, which generates more movement in the economy (you know, like people with jobs buying new iPods and new computers), which leads to further economic growth. Secondly, technology amplifies productivity in the workplace. Not only in the manufacturing arena, but also in corporate offices... From the copiers, to email, to BlackBerries (heck, there are specialists here that handle just those at my work), to servers, to the vacuum cleaner repairs, to everything else.

In my view computers haven't even stopped to take a breather since 1994 (when I got my first 486); the evidence is right here in front of us. I bought my first cell phone in 2002: a bulky black-and-white LCD thing... Yet in 2004 I had a full VGA color slim flip phone.

Sure, we might not see strong AI or a singularity even, but to say technology isn't improving at a drastic and accelerating rate is just silly.

Re:Don't underestimate... (1)

spaztik (917859) | more than 8 years ago | (#14980769)

I think what Kurzweil is getting at with the singularity theory is the point at which machine intelligence surpasses human intelligence and the machines perform all the subsequent 'research' in the field of technology. That's where the "exponential growth in the rate of exponential growth" theory plays in. Looking back at past technological advances only supports Kurzweil's theories, seeing as all his works draw from past trends in technology. I really think Kurzweil knows his stuff.

Re:Don't underestimate... (1)

Traa (158207) | more than 8 years ago | (#14980802)

"I am not sure where he pulls that super-exponential growth figure from."
Ever heard of Moore's Law? The exponential growth of processing power that has been ongoing since the 60's. In his book "The Age of Spiritual Machines", Ray Kurzweil points out that this exponential growth can also be found in many more technological developments. He doesn't make this stuff up; it is pretty common knowledge. What Kurzweil does is point out that most of our prediction-of-the-future models are still based on industrial-age linear growth models.

Re:Don't underestimate... (1)

TheRaven64 (641858) | more than 8 years ago | (#14981798)

I was at a talk earlier in the week by Bob Colwell (former Intel Fellow, leader of a few Pentium design teams). The focus of his talk was Moore's Law, and how it is no longer a useful guide. The important thing to remember about Moore's Law is that it makes economic, rather than technological predictions. It claims that the number of transistors it is economically feasible to put on a single IC doubles every n months (where n is somewhere between 12 and 24, depending on when you ask Gordon Moore).

According to the materials people at Intel, they can keep up this level of progress for at least the next 10 years. This means (going by transistor count), it will be possible to make a 2000[1] core P6. The catch? It will draw 200KW, need RAM far faster than is predicted to exist to keep it even 25% fed with data, and have an incredibly small potential target market of people who actually need that much power. In short, it will not be economically feasible to produce, even if it is technically feasible.
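As a back-of-the-envelope check on that figure (the doubling periods and the baseline core count below are my own assumptions, not anything Intel or Colwell stated):

# If the transistor budget doubles every n months, how many P6-class
# cores fit after 10 years? Baseline and periods are assumed.
def doublings(years: float, months_per_doubling: float) -> float:
    return years * 12 / months_per_doubling

cores_today = 2   # assumed baseline
for n in (12, 18, 24):
    growth = 2 ** doublings(10, n)
    print(f"every {n} months: x{growth:.0f} -> ~{cores_today * growth:.0f} cores")
# A 12-month doubling gives ~2048 cores, roughly the "2000 core" figure;
# slower doubling periods give correspondingly fewer.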

I do quite a lot of computationally expensive things, and I am rarely in a position where I need more than about 1GHz of CPU power on my laptop. If I am running big computational jobs, I use one of the clusters or supercomputers in the back room (and I am in a tiny minority of computer users needing even that).

The future of computing is social rather than technical. The vast majority of computers now use either ARM or PowerPC chips. Their users don't call them computers, they call them telephones. Some of them are called consoles. Bob said we are leaving the PC era and entering the ubicomp era, and I agree.

[1] I think this was the number. It was something huge, anyway.

Re:Don't underestimate... (1)

booch (4157) | more than 8 years ago | (#14981937)

According to the materials people at Intel, they can keep up this level of progress for at least the next 10 years. This means (going by transistor count), it will be possible to make a 2000 core P6. The catch? It will draw 200KW

I think you either misunderstood, he was exaggerating, or the materials people at Intel are completely full of shit. There's no way they've got materials or technology that can dissipate 200KW of heat off of a computer chip, no matter how much money they throw at the problem.

Re:Don't underestimate... (1)

TheRaven64 (641858) | more than 8 years ago | (#14982142)

This was his point. They can put that many transistors on the chip, but powering and cooling them is going to be impossible. No one is going to want a laptop that drinks several gallons of liquid nitrogen a second just to stop the chip burning through the case, desk, and floor, and requires its own electricity sub station to power.

Re:Don't underestimate... (1)

Bogtha (906264) | more than 8 years ago | (#14980973)

First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau.

An alternative way of looking at it is that first you invent something and market it to rich people. Then the focus doesn't move to inventing something else; it moves to making the existing thing cost-effective so that it can be marketed to everybody else. Then, after that, the focus still doesn't move to inventing something else; it moves to refinement of the existing thing, to make it more profitable and to compete against other people marketing similar things.

However, this only makes sense if you focus on particular markets. It doesn't apply if you are looking at what people across the whole of society are inventing. A particular industry might match the above description, but that doesn't mean science as a whole does.

Such a development model might very well go on for a long time, without reaching a Kurzweil-style singularity.

The key to the singularity is not that invention simply speeds up, it's that inventions tend to speed up the rate of future invention. For example, imagine if the computer hadn't been invented. Think of everything that's been invented since computers became commonplace. Would all of those things have been invented if computers hadn't been around?

This applies to other inventions/discoveries too - writing, electricity, power conversion, the internal combustion engine, flight... you name it. And it will apply to future inventions too. Advancements in artificial intelligence, space travel, nanotechnology and power conversion will have similar effects on the rate of invention.

Re:Don't underestimate... (0)

Anonymous Coward | more than 8 years ago | (#14980453)

Quote:

"There's even exponential growth in the rate of exponential growth."

Duh, I bet there's even exponential growth in the rate of exponential growth in the rate of exponential growth, too.

Anyone with a (halfway decent) education care to guess where this will end?

Re:Don't underestimate... (1)

SimilarityEngine (892055) | more than 8 years ago | (#14980812)

I didn't RTFA, but perhaps he's claiming that the growth is described by a function such as exp(t*exp(t)), or even exp(exp(t)), rather than just pointing out the obvious fact that (d/dt)(exp(t)) = exp(t) ? The wording (at least in the summary) is far too ambiguous, and in any case how do we define a numerical "rate of progress"?
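For reference, the distinction being drawn here is just standard calculus, written out:

% Ordinary exponential growth: the rate is proportional to the quantity,
% so the "rate of exponential growth" trivially grows exponentially too.
\frac{d}{dt} e^{kt} = k\,e^{kt}
% A genuinely stronger claim would be that the exponent itself grows,
% e.g. double-exponential growth:
f(t) = e^{e^{t}}, \qquad \frac{d}{dt}\, e^{e^{t}} = e^{t}\, e^{e^{t}}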

Re:Don't underestimate... (1)

joshv (13017) | more than 8 years ago | (#14980493)

Responding to Kurzweil. Exponential growth is a mathematical concept. The rate of change of a quantity is proportional to the amount of the quantity. This presumes the ability to measure that quantity. How exactly does one measure "technology" or "progress" in sufficient detail to determine that they are increasing at an exponential rate? Sure, one can make qualitative observations. It seems that the more technology there is, the more quickly we can design new technology. But that's an awfully fuzzy concept upon which to base a mathematical claim of exponential behavior. What about lock-in? How do we know that the current (massive) installed base of IP based networks and computers hasn't prevented a new and better technology from making us all ten times as productive?

Even if things actually are exponential at the moment, there is also the issue of time scale. Over sufficiently small time scales many growth curves can appear exponential. The most famous example is the S curve, the growth curve of bacteria in a petri dish (and of humans, as it is beginning to appear). I am sure railroad building looked exponential for a few decades of the 1800s, but obviously there are real limits to the miles of railroad track you can lay.

I think underlying all of this is the false impression of infinite capacity that has arisen out of the phenomenal increase in computer power of the last 4 or 5 decades. Sure, I can now house a terabyte of data under my desk. That certainly feels limitless. And geez, who is to say that in ten years it won't be a petabyte. The perception is one of infinite capacity, as the limits recede over the horizon as fast as we approach them. But rest assured, all physical attributes of computers are now, and forever will be finite. There are real limits to the amount of computation a particular piece of matter can accomplish, real bounds on the amount of power required for that computation, and last I checked, only a finite amount of matter available for conversion into computational devices.
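For reference, the two growth models contrasted above, in their standard textbook forms (not taken from TFA):

% Exponential growth: rate proportional to the quantity itself.
\frac{dN}{dt} = kN \quad\Rightarrow\quad N(t) = N_0 e^{kt}
% Logistic (S-curve) growth: the same at first, but saturating at capacity K,
% which is why the two are indistinguishable over short time scales.
\frac{dN}{dt} = kN\left(1 - \frac{N}{K}\right)
\quad\Rightarrow\quad N(t) = \frac{K}{1 + \frac{K - N_0}{N_0}\, e^{-kt}}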

Re:Don't underestimate... (1)

Victor_Os (677960) | more than 8 years ago | (#14980516)

The Singularity
He misspelled "Prime Intellect"

Re:Don't underestimate... (1)

CastrTroy (595695) | more than 8 years ago | (#14980569)

It seems like technological feats happened much quicker in the past: the building of the Pyramids, the Great Wall of China, transcontinental railroads. We can't even build a secure OS, or a laptop that doesn't toast your lap.

Re:Don't underestimate... (0)

Anonymous Coward | more than 8 years ago | (#14980831)

The Great Wall of China took centuries to build, pyramids were often not completed upon the death of the pharaoh who was to be buried in them, and they were created over a 3000+ year time span. We have been working on an OS for at most 40-50 years, and of those, security has been a major concern for maybe 20? Your concept of quick seems a bit off ;)

Re:Don't underestimate... (1)

72Nova (935275) | more than 8 years ago | (#14981207)

"There's even exponential growth in the rate of exponential growth."

This isn't out of the ordinary. d/dx(e^x)=e^x. All exponential growth rates grow exponentially, that's kinda the defining characteristic of exponential functions.

Re:Don't underestimate... (0)

Anonymous Coward | more than 8 years ago | (#14981323)

In the future our ability to predict the future will expand exponentially. When graphed, we will do a loop. Swear to god.

Kurzweil (1)

dargaud (518470) | more than 8 years ago | (#14981498)

I've been reading some of Kurzweil's articles for over 2 decades and 99% of the time I call bullshit. He always promises AI but he (or others) never comes up with anything close to his predictions. I have no idea where he gets his projections from, even if his (?) singularity theory kinda makes sense (once the computing power of a CPU gets above that of the human brain thanks to Moore's Law, all hell breaks loose). Just to say that putting 10 thousand pocket calculators next to each other doesn't make them an AMD x64...

Future is Dim... (1)

Simonetta (207550) | more than 8 years ago | (#14981876)

All projections of the future are based on a continuation of middle-class life in the USA, Japan, and Europe as it is today, only more so.

Probably won't be that way. Oil is peaking, which means that the easy oil is gone and what's left won't be easy to get to. Or easy to pump out. Or protect from pirates, terrorists, or religious fanatics.

Plus...

The world's population continues to explode. Billions more young people are becoming mature (15-20 years old) and finding that there are no jobs available. It will be easy to blame everything on the 'rich' (which means you and me). For the young, learning that there are no jobs and no futures and no money means they have nothing to lose by joining up for a big war 'against the infidels' (again, me and you).

This means MASSIVE price and supply disruptions in the oil markets. Since we use oil for everything (food, clothing, shelter, transportation, communications, financial structure, and electronics), there will be major disruptions in everything.

That includes science and computing.

Major disruptions mean that things are not going to get a whole lot better in the next 30 to 50 years. The people who tell you differently are either dreamers or fools, or they don't understand the implications of Peak Oil.

And the world's population continues to explode and billions more young people continue to mature to adulthood while all this is happening.

Re:Don't underestimate... (1)

FridayBob (619244) | more than 8 years ago | (#14982067)

Whoa, hey! Kurzweil is definitely out there, huh? :-)

Back here on Earth, I'm not so sure things are going to move along so quickly. For instance, a concept I would love to see developed in my lifetime is a wearable computer with:
  • AI
  • voice interface
  • persistent wireless connection to the Internet
  • lots of memory

It would also come with (optional) things like:
  • HUD (glasses or contact lenses)
  • miniature cameras (possibly infrared)
  • GPS

Think of what you could do with a device like this and how profoundly it would change society! However, even if we were able to develop a good enough voice interface without the AI, the damn telcos could still end up rendering these devices less than half as useful if they never offered people flat-rate, broadband, wireless Internet connections for them. It's the same problem as with those mobile phones that you can't buy in the States because the profit margins for them aren't considered high enough. High tech costs a lot of money, and I fear that a lot of interesting concepts like this may never reach the market because they don't make good enough business sense to the dominant players.

In another situation, suppose good old Microsoft were to develop the wearable computer I outlined above. Would you still want one? I'm not so sure I would: the potential for abuse is too great to ignore. This could lead to these devices not only being mistrusted, but even outlawed. And what would we have to thank for the death of this wonderful concept... Microsoft? Uh, uh: market forces. The market giveth, and the market taketh away.

In other words, I feel that, due to market forces, the road to Kurzweil's digital utopia is probably going to be a little longer and bumpier than he'd like to believe.

The Future is Linux (-1)

Anonymous Coward | more than 8 years ago | (#14980247)

_d8b____________________d8b_______d8,
_?88____________________88P______`8P
__88b__________________d88
__888888b__.d888b,_d888888________88b_.d888b,
__88P_`?8b_?8b,___d8P'_?88________88P_?8b,
_d88,__d88___`?8b_88b__,88b______d88____`?8b
d88'`?88P'`?888P'_`?88P'`88b____d88'_`?888P'

______d8b________________________d8b
______88P________________________88P
_____d88________________________d88
_d888888___d8888b_d888b8b___d888888
d8P'_?88__d8b_,dPd8P'_?88__d8P'_?88
88b__,88b_88b____88b__,88b_88b__,88b
`?88P'`88b`?888P'`?88P'`88b`?88P'`88b


Nature? (-1, Flamebait)

suso (153703) | more than 8 years ago | (#14980258)

What happened to the article on Nature using flawed research to conclude that Wikipedia is only slightly less accurate than Britannica?

Which Nature magazine? (1, Interesting)

caluml (551744) | more than 8 years ago | (#14980260)

Is this the same Nature magazine [theregister.co.uk] that made stuff up to suit its purposes about Wikipedia?

Re:Which Nature magazine? (0, Offtopic)

caluml (551744) | more than 8 years ago | (#14980319)

What? Why is this flamebait?

Re:Which Nature magazine? (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#14980333)

Because you're trying to dig up a previous story that the editors deleted. Why they deleted it is an interesting question, but it's usually considered a taboo subject once they do.

Re:Which Nature magazine? (1, Insightful)

Roj Blake (931541) | more than 8 years ago | (#14980335)

It's flamebait because you pointed out valid criticisms of a slashmind accepted 'fact'.

Re:Which Nature magazine? (1)

goldspider (445116) | more than 8 years ago | (#14980418)

And any subsequent discussion of said posts will be summarily downmodded as well.

I predict (0, Funny)

Anonymous Coward | more than 8 years ago | (#14980290)

a lot more ones and zeros.

Re:I predict (1, Funny)

Anonymous Coward | more than 8 years ago | (#14980530)

If you're talking about the moderation of posts on this article, then I totally agree.

Trends (5, Insightful)

Red_Foreman (877991) | more than 8 years ago | (#14980305)

There are two distinct movements, and in 2020 we could see one trend finally win out over the other, for better or for worse.

One trend is the Open Source movement, the other is the closed source / DRM movement.

The way I see it, one of two things could happen: Computing becomes nearly free, due to lower and lower hardware costs and free operating systems, with entertainment at our fingertips, or... an extreme DRM lockdown where only "trusted" devices may connect and Linux becomes contraband.

Re:Trends (2, Insightful)

hal2814 (725639) | more than 8 years ago | (#14980488)

Despite what you've read in the GPL3, open source and DRM are not mutually exclusive. Just because you can read the source code on how a DRM scheme works does not mean that you can bypass it. DRM also won't necessarily lead to the demise of Linux. There are too many Linux shops who are not going to be willing to switch server platforms over trusted computing measures to ever let that happen. I'm not the biggest fan of DRM, but it's probably here to stay and it's not going to lead to the end of the OSS movement. The sky is very much where it always has been and won't be falling by the year 2020.

Re:Trends (1, Insightful)

Anonymous Coward | more than 8 years ago | (#14981125)

Despite what you've read in the GPL3, open source and DRM are not mutually exclusive. Just because you can read the source code on how a DRM scheme works does not mean that you can bypass it.

Thanks for stating a truism that the GPL v3 explains in great detail. Perhaps you should read it again. The problem is Trusted Computing (TCG) hardware... this controls access to data based on the digital signature of the executable code. With this hardware you can have the source code... but you can't modify it and still have an executable that works properly. You can't even simply recompile it yourself without modification. You do not have the key necessary to sign it and make it "official"... you *must* only use approved binaries. And this doesn't just apply to music and video either. It's *any* digital data (including software).

DRM cannot be done with any software that you can modify... hence the push by Sun for its "open source" DRM, which is reliant on TCG hardware to work at all. Their abuse of the term "open source" is a massive calumny. They know all too well that it's no more open than Microsoft Office.
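As a stripped-down illustration of that "approved binaries only" gating (this uses an ordinary HMAC as a stand-in for the vendor's hardware-held key; a real TCG/TPM stack is far more involved, so treat this purely as a conceptual sketch):

# Data is released only to a binary whose digest matches one signed with a
# key the user never holds. Recompiling the source changes the digest, so a
# modified or rebuilt binary gets nothing.
import hashlib, hmac

VENDOR_KEY = b"held-by-the-vendor-not-by-you"   # hypothetical

def sign(binary: bytes) -> bytes:
    return hmac.new(VENDOR_KEY, hashlib.sha256(binary).digest(), hashlib.sha256).digest()

def release_data(binary: bytes, signature: bytes, secret: bytes) -> bytes:
    """Hand over the protected data only to an approved (signed) binary."""
    if not hmac.compare_digest(sign(binary), signature):
        raise PermissionError("unapproved binary: recompiled or modified code gets nothing")
    return secret

official = b"\x7fELF...official player..."
approved_sig = sign(official)                    # done by the vendor, not the user
print(release_data(official, approved_sig, b"decryption key"))
# The same call with a rebuilt binary raises PermissionError.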

Re:Trends (1)

ceoyoyo (59147) | more than 8 years ago | (#14982163)

As the AC pointed out, you're right: DRM isn't necessarily contrary to open source (although practically...), but trusted computing (a form of DRM) IS.

I disagree with him that you can't implement open source DRM (I bet you can), but it's harder and not nearly as secure as trusted-computing-backed DRM.

Re:Trends (1, Interesting)

Anonymous Coward | more than 8 years ago | (#14980824)

In other words, it all comes down to government interference in the market. In the absence of government, there could be no such thing as "contraband" or "prohibition". The only thing prohibited would be coercion, and the only thing mandated would be voluntary association -- exactly the way human nature intended it to be.

Re:Trends (1)

zen-theorist (930637) | more than 8 years ago | (#14981268)

TFA talks about the relation between computers and science: why is it that all slashdotters translate science to technology, and start yapping about open source and closed source and JavaScript and Windows Vista and Xboxes?

What happened to the P vs NP question, making A.L.I.C.E. talk more sense, and so on? That, now, would be science...

What a relief (0)

Anonymous Coward | more than 8 years ago | (#14980337)

Whew, they didn't mention Microsoft or Windows once!

Slashdot censorship! (-1, Offtopic)

Anonymous Coward | more than 8 years ago | (#14980340)

Slashdot is censoring comments critical of the site!

Go to this thread [slashdot.org] (note -1, nested) and search all three pages for this comment [slashdot.org] . Try searching for "grammar" or "it's official dell". You can't find it. The only way to find it is to go to Search Discussion and type in the title!

The only conclusion is that Slashdot is hiding posts critical of the website!

Normally such comments are modded to -1. But I guess when this doesn't happen the editors have no choice but to manually blacklist the comment!

Vinge disappoints (2, Insightful)

ObjetDart (700355) | more than 8 years ago | (#14980377)

Was anyone else as completely underwhelmed by Vinge's article as I was? For a man who has produced so many incredible, original visions in the past, he seems to be stuck in a bit of a rut these days, going on and on about ubiquitous computing. There wasn't a single idea in his article that I haven't heard many times before already, from him and others. It reads like something he cranked out in 10 minutes to meet some last minute deadline...

Re:Vinge disappoints (1)

hopeless case (49791) | more than 8 years ago | (#14982183)

I felt exactly the same way.

I am so used to being blown away by Vinge.

He doesn't often write but when he does, he puts a lot of thought into it.

He didn't put much thought into this article.

No high hopes (4, Insightful)

hcdejong (561314) | more than 8 years ago | (#14980394)

Compare the state of computing in 1990 with that of today. Yes, computers are immensely faster than they were 15 years ago, but have things changed on a fundamental level? Have computers become more *intelligent*, rather than just faster? I, for one, am disappointed.

An example: handling contact and scheduling information. In 1993, Apple showed how it should be done with the Newton. 13 years on, the most popular application (Outlook) still doesn't have that level of functionality.

Computers were supposed to make things easier for us. Instead, they all too often complicate things needlessly.

Yes, thanks to better hardware, more tasks have become feasible to do on a computer. Video playback, massive networks like the internet are very nice.

But while new functions are being added, existing software stagnates. Mac OS X is nice and robust, but UI improvements over Mac System 7 are tiny to nonexistent. Windows shows a similar lack of progress. Word processing is not fundamentally different from 1984.

Re:No high hopes (3, Interesting)

nowhere.elysium (924845) | more than 8 years ago | (#14980547)

This has more to do with the fact that people are becoming increasingly blasé about the potential of computing. At the moment, I am actually undertaking a project to try and design a human/computer interface that is totally removed from what Engelbart came up with back in '68 - we're trying, essentially, to show people that thinking outside the box is the best way to improve the use of said box. Computers these days are capable of amazing things in 3-dimensional graphics, but we're still constrained by 2-dimensional 'navigation' methods. Instead of breaking barriers and then returning to safe territory, how about we burn the return bridges a bit? What I (and my group) are doing is re-inventing man/machine interaction; we're not, strictly speaking, doing anything very new; we're just trying to do it differently. The problem that people have imposed on themselves is a desperate love of throwbacks; I'm sitting here, typing on a keyboard not entirely dissimilar to the typewriters of the 1880s. We've got a machine that can calculate variances in chaos theory sitting under our desks, and we're still treating them as if they were mechanical, hand-milled machines. We need to learn to progress in and of ourselves, as well as our technologies.

Re:No high hopes (1)

HiThere (15173) | more than 8 years ago | (#14982129)

I really doubt that you can dramatically improve things by adding another geometric dimension. If you want to improve the interface, then the improvement will need to be based on the improved understanding (by the computer) of the interface. A 3-D interface is probably not going to be particularly useful. Various people keep trying to prove me wrong about this, but to my mind what that would do is increase the computational requirements on the user, with minimal to no improvement on the information throughput. This is the opposite of what is needed.

If you want to improve the interface, you need to make it more TRANSPARENT. Less something that gets in the way. This is, indeed, a difficult thing to accomplish. Currently the text interface to computers is based around ink on paper, the most efficient technique that people have ever developed for transmitting information asynchronously. It's made some moderate improvements, but I still prefer to read a book than to read a computer screen. The book is more transparent.

The main advantage that computers, as a media, currently have (besides eye-candy) is during the creative process. I've *used* typewriters. Computers are so much better it's nearly unbelievable. But for reading, ink on paper is better. (Immersive games are something different...but I think that they are apart from the rest of what I'm saying. They can be educational tools, but when adapted for that purpose they are often quite dull, and they don't seem to have any other use. Prove me wrong!)

One possibility that I see for an interface which would, in certain circumstances, be superior is based around physical modeling. This requires the computer to better understand the physical world, so that it can interpret what a camera is seeing. This would be an interface for telefactors that highlighted things needing attention. Unfortunately, this isn't one simple thing, but a complex, where different uses of telefactors would require different understandings. A surgeon might use this interface to operate on a patient in Paris from Tokyo, e.g. Here the potential improvement would be to clarify the boundaries between internal organs, and remove clouding caused by blood (either by visual processing, or by controlling some local entity, human, robot, or telefactor, and causing them to clear the visual field).

As you can see while this would be important, it would also be very specialized. A telefactor pulling cables would need a different system.

Notes:
A telefactor is the controlling mechanism for a waldo. If a human is operating the telefactor, the resultant system is called telepresence. If a computer is operating the telefactor, the resultant system is called a robot. I have no idea what you call a system that switches back and forth between being controlled by a computer and a human.

Re:No high hopes (1)

ceoyoyo (59147) | more than 8 years ago | (#14982216)

Agreed. People are fond of thinking 3D is the way to go but as yet our displays are very much 2D, as are our input devices. I have yet to see a GOOD, convenient THEORETICAL 3D display technology. Input is easier, but even so there are very few actual pieces of hardware and I haven't seen one of those that's the equivalent of a mouse.

Re:No high hopes (1)

CouchP (769984) | more than 8 years ago | (#14980649)

I think what we are waiting for is something akin to language. You mention that word processors have hardly changed since '84. I say a radical change in that particular application will not happen until there is a radical change in the social need for the written word. What I mean is, technology in a specific area of application seems to hit a point where further improvements become linear or even flat. This is because of the nature of sorting out the good methods from the superb methods. Windows and OS X suffer the same fate for now, since the current need for computers is limited to 2-dimensional word processors and spreadsheets etc. Once there is a need at the corporate or personal level to work in the next medium, 3D for instance, and not just gaming, then the fundamental way of doing things will shift and the exponential growth will begin anew. We see now that it may be possible one day to travel to other stars, but until the NEED to do so is present, it will not have that exponential growth. My arguments are tenuous, and a little off the cuff, but I think there may be some truth or meaning in them. Thanks

Re:No high hopes (1)

master_p (608214) | more than 8 years ago | (#14981195)

Compare the state of programming with the 60s. Which programming languages are we using today? C++, Java, Perl, Python, etc... in other words, ALGOL, disguised in various forms.

We can have no real progress until we have AI to which we can just describe what we want and have it understand. Only then can we make computers as clever as we imagine they can be.

1984 (2, Funny)

2008 (900939) | more than 8 years ago | (#14981338)

Of course word processing hasn't changed since 1984. LaTeX and GNU Emacs were written in 1984... how could you improve on that?

Re:1984 (1)

Spaceman40 (565797) | more than 8 years ago | (#14981619)

Hey, the best programming language was written in the 60s: Lisp!

(only partially joking)

Re:No high hopes (1)

ribuck (943217) | more than 8 years ago | (#14981758)

> Word processing is not fundamentally different from 1984

In 1986 I was using Microsoft Word for DOS. It had menus with keystroke shortcuts, an interactive tutorial, stylesheets and mail-merge. It supported PostScript output, and it had every formatting option that I needed to produce professional-quality documents.

It also had a consistency and elegance in its structure that is sadly lacking in much of today's bloatware.

Sure, today's version of Word will export to HTML and has WYSIWYG anti-aliased fonts, and can embed Excel spreadsheets, but the 1986 version was more pleasurable to use.

Robotics (0)

Fedarkyn (892041) | more than 8 years ago | (#14980413)

The increase in computing power will make it possible (even banal) to have humanoid robots at home...

Who is Vernor Vinge? (2, Informative)

resonte (900899) | more than 8 years ago | (#14980432)

In case you wanted to know

Vernor Vinge is a science fiction author who was the first to coin the term "singularity", and he uses the idea in some of his novels. Linkie: http://mindstalk.net/vinge/vinge-sing.html [mindstalk.net]

If you would like to read one of his books, I would suggest Across Realtime, which touches on this subject lightly. Although his other stories are somewhat less palatable for me (but I've only read three).

Other authors who delve more deeply into singularity issues are Greg Egan (hard going, but definitely worth reading) http://gregegan.customer.netspace.net.au/ [netspace.net.au] and Charles Stross, with Accelerando http://www.accelerando.org/_static/accelerando.html [accelerando.org].

Science fiction is odd as a genre, since the authors' minds are shaped by the technology they see as possible at the time of writing. Science fiction writers in the past depicted a future with minimal use of networked computers, for instance. So the theme changes over time, whereas other genres remain pretty static.

No more computers by year-end 2005 (1, Offtopic)

fastgood (714723) | more than 8 years ago | (#14980449)

Three Dell TV spots recently did not even mention the word "computer" once.

An HP newspaper ad last week contained 500+ words of copy, and the closest they got to that naughty bit [wiktionary.org] word was in the fine print with the phrase "computing environment" when referring to thin client boxes connecting to a server.

C**PUTERS are obsolete for Dell and Hewlett Packard!

Re:No more computers by year-end 2005 (1)

ch-chuck (9622) | more than 8 years ago | (#14980631)

Could be - in the late 50s people were so fed up with "Univac" and job-destroying "giant brains" that Digital Equipment Corp. didn't even call their machines "computers". They were "Programmable Data Processors" (PDP). If you look at radio magazines from the 20s and 30s, Hugo Gernsback was envisioning radio as the solution for everything - radio weapons, radio movies (TV), radio medicine, radio prospecting, blah blah blah blah. Now, 60 years later, you just make a call on your cell phone and go about normal life without even thinking about the underlying technology that makes it possible. In 2020 computers will be as ubiquitous as radio, TV, automobiles, etc. - they will just be invisible to normal life, only noticed when they FAIL to function.

Future prediction in technology is foolish (2, Insightful)

Opportunist (166417) | more than 8 years ago | (#14980495)

Let's take a parallel with the space race of the 60s. Everyone expected development to continue at the same pace it had during the 60s. I mean, face it, between '60 and '70 the technology went from being able to lift some rather small mass into orbit (well, at least sometimes; most of the time it just went up in smoke) to bringing a three-man craft, including lander, car and a lot more junk, to our moon! People extrapolated. 60 to 70: zero to moon. 70 to 80: flight to moon -> moon base. 80-90: Mars. 90-2000: past the asteroid belt and prolly even more.

Now, what people didn't take into consideration was that, with the race over, the funding stopped. No more money for NASA, no more leaps in science.

The same could happen to us and computers. Of course, it is vastly different since there isn't only one customer (in the space race, the only customer was the feds, and when they don't want your stuff anymore, you're outta biz), but it all depends on whether the "consumer base" for the computer market is willing to spend the money. There are SO many intertwined issues influencing the market, and thus development, that it's virtually impossible to predict what things will be like in 5 years; trying to give an even remotely sensible prediction for 15 years is hopeless.

Too many factors play into it. Sure, you can extrapolate what COULD be, considering the technology we have now and the speed at which technology CAN evolve. Whether it does will depend largely on where our priorities lie. DRM: will it kill development, with fewer companies daring to enter the market, or will it increase development because DRM technology swallows huge numbers of cycles? Legislation, patents and copyright: how will the market react? Will we let it happen, or will we refuse to play along? Are we descending into being consuming drones, or will there be a revolt against the practice of abusive patents?

Too many variables. Too many "what if"s.

Re:Future prediction in technology is foolish (0)

Anonymous Coward | more than 8 years ago | (#14980595)

You also have to take into account that the moon landing was faked, and so the technology at the time was much overestimated.

Re:Future prediction in technology is foolish (1)

Opportunist (166417) | more than 8 years ago | (#14981354)

Well, we got vaporware today, too, so that variable can be left out of the loop.

Not to be Funny But... (1, Offtopic)

eno2001 (527078) | more than 8 years ago | (#14980511)

...that all depends on how we define science in the next few decades. Currently science is under attack for political reasons and will likely experience some level of change if something isn't done to curb the direction things are going. The political motivation behind the current debasement of science is money. There are some very wealthy people who stand to make, keep or lose a lot of money depending on how much science the average person is made aware of. Those people are trying to muddy the waters and bring pseudoscience and fantasy (Intelligent Design, UFOs, Angels, and the like) to the same level of respect that science once held. Sadly, it appears to be working, since there are many average people out there who would rather believe in old myths reframed in today's culture than actually dig into real scientific explanations for things.

Another part of the problem is that many of the "scientists" themselves are the people with the money and political motivation to keep science from the masses. (I put scientists in quotes since they tend to be more businessmen than scientists which is usually a horrible combination when it comes to society at large) A perfect example is Donald Rumsfeld and his connection to the Searle corporation. Searle developed Nutrasweet by way of serendipity while researching some medicines. The administrations before Reagan would not OK Nutrasweet for mass consumption. This was obviously detrimental to Searle's, and Rumsfeld's cash flow. So during the Reagan administration, a former employee of Searle was appointed to the FDA by Ronald Reagan. This appointee only did one thing and then resigned. He approved Nutrasweet for mass consumption. (Look it up if you don't believe me) The reason that Nutrasweet was not approved before this time was that too many animal tests indicated that Nutrasweet could cause tumors and a wide variety of illness. Many of these illnesses are not life threatening, but are discomforting enough to cause the sufferer to seek out medical attention. Usually on inspection, the doctor will prescribe medication from a large pharmaceutical company (in some cases Searle) which will take care of the symptoms but will be required for the patient as long as they suffer those symptoms. Nice perpetual motion money making system there...

So some scientists are crooked and simply work to further the interests of their employers rather than improving the human condition. Other scientists who work to improve our understanding of the universe, our planet and ourselves are rarely rewarded for their work and these days are being attacked as "heretics" in our newly "christian nation". So I would say that if things continue as they are going now, we'll have science churches that preach Intelligent Design and prisons to put the heretical non-christian scientists in. The computers of the day will be nothing more than glorified televisions that pass along the "wisdom" of the christian sanctioned "scientists". And the corporations and governmental bodies who live off of these systems will employ the wealthiest people in the world. Assuming that the rest of the world doesn't wise up and bomb us into oblivion...

Re:Not to be Funny But... (0)

Anonymous Coward | more than 8 years ago | (#14981692)

Science is under attack today. It has lost its hallowed ring of universal inspiration, its selfless devotion to mankind, its lofty goals and great achievements. This is partly the fault of those who distort history to make science and scientists into something they are not; science is cumulative and incremental, and scientists make mistakes that look unbelievable in hindsight and act like asses more often than not - and those were the worthy ones, taking more risks and staking more of themselves than today's academic scientists do. Careerists don't make good scientists, yet they fill most academic posts. This corrupts the whole system: the research universities they staff, the journals they review for, the papers they write; the list goes on and on. Then there is the question of their moral standing; I would like to think physics suffered more than a little setback when physicists exploded the atomic bomb. This is tied to the fundamental problem of state-funded, and therefore state-directed, research. Science is amoral; scientists should not be.

The truth is that scientists are not made of the stuff they used to be; they are careerists and businessmen now, big careers and big business. But don't all tall things fall? I wouldn't want to be around when this domino does.

PS: I wish people would stop treating churches as the devil; at the very least, their existence for as long as there has been society shows they have some value. It is not their fault science has lost its hold over so many people.

Wrong focus (2, Interesting)

jettoki (894493) | more than 8 years ago | (#14980538)

I'm not very concerned with progress in hardware. My 3 year old computer runs pretty much anything just fine, and I expect it to continue doing so for a few years to come, at least. Right now, I'm severely disappointed by the lack of ideas in technology. There's only so far you can take word processing, e-mail, scheduling, etc. Enough with 'innovation' in those areas, already!

What I'd really like to see is improved content creation tools. How about 3D scanners, so Joe Artmajor can easily scan his sculptures into modelling programs? They exist, but they aren't on the consumer market yet. I'd rather see that than another few years of GPU speed wars.

MOD PARENT UP! (1)

Spy der Mann (805235) | more than 8 years ago | (#14981320)

What I'd really like to see is improved content creation tools. How about 3D scanners, so Joe Artmajor can easily scan his sculptures into modelling programs?

And I'm still waiting for the do-it-yourself anime rendering program :(

Anyway, mod parent up. He's so right about this one.

China and India (0)

Anonymous Coward | more than 8 years ago | (#14980542)

While people focus on the hardware aspects of the future of computing, I think the most earth altering change will come from the emerging engineer classes of China and India. I think the vast increase in the number of people working on technical problems (even if there is some redundancy) will have a far greater effect on the world than any hardware developments possibly could. The big assumption here is that the rise in technologists in India and China will dramatically increase the global pool of engineers (instead of just moving it around). This is a huge assumption, by no means a given and perhaps a best case scenario.

Assuming no major disruptions occur, the giant new advanced labor pools in China and India will have 15-20 years of experience by the time 2020 rolls around. The increase in experience and investment will have these countries turning out unique inventions, technologies and discoveries at rates comparable to western nations today.

Hopefully, others will follow India and China's examples and work on educating and training their populations. Countries formerly unable to fulfill their own basic needs, posing a danger and a drain to other countries, could begin not only solving more of their own problems but positively contributing to the world at large.

The utilization of the world's nerds will transform the world for the better. Nerds of the world, unite!

Leave Technology to China and India (0)

Anonymous Coward | more than 8 years ago | (#14981265)

By 2020 the US should be exporting our "surplus females" to China and India since they both practice gender biased abortions against females.

Many provinces in India are reporting large skews, with some places like Punjab showing only 500 female births per 1,000 male births.

China is just as bad or worse.

All of these males will someday maybe want to touch a female besides their mother.

And we can supply them. We got plenty of "surplus females".

Future of technology (1)

bsieloff (963111) | more than 8 years ago | (#14980776)

It still kills me that we travel in a metal beast built around the concept of controlling explosions. Just seems silly to me. I am looking forward to what the future of computing does bring. The more we learn and accomplish the faster we will be able to realize our next dream.

A Singularity, madam. (2, Insightful)

clydemaxwell (935315) | more than 8 years ago | (#14980816)

I think a singularity is possible, by the definition of "a point beyond which we cannot hypothesize", because we cannot truly conceive of or understand that point. But will it necessarily be AI, or even computers, that creates it? It's about as likely as extraterrestrial contact. Which is, you'll note, also a singularity.

The Immortal Words (2, Funny)

uberjoe (726765) | more than 8 years ago | (#14981111)

Of Professor Frink:

"I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings in Europe will own them."

Re:The Immortal Words (1)

$0.02 (618911) | more than 8 years ago | (#14981192)

but will they run Linux?

Waste of time (2, Interesting)

HangingChad (677530) | more than 8 years ago | (#14981165)

For 20 years I've been hearing about the future of computing and when the future gets to be the present it doesn't really look anything like the future that was previously described. So to me that whole line of speculation is just a waste of time.

The truth is you don't know which technologies will take hold, or why. Sometimes you think X should be popular, but it doesn't catch on until 10 years after you find it. Or something you blow off as insignificant comes out of nowhere to dominate a market.

Although I have noticed one small arena that tends to be a good predictor of the wider market. If p0rn distributors pick it up, then you can almost bet it's going to be the next insanely great thing. I remember taking a training class for a streaming video server in Atlanta a few years ago. Half my classmates were from p0rn distributors. Which definitely made break time more interesting.

Milestones in scientific computing (1)

hpcanswers (960441) | more than 8 years ago | (#14981317)

I was surprised by some of the items missing from Jacqueline Ruttimann's list of "milestones in scientific computing." Perhaps the most glaring omissions are those that have led to supercomputing-for-the-masses: the widespread use of clusters that have dramatically lowered the cost of computing systems, the adoption of MPI for portable software, the development of programs like MATLAB and Mathematica that greatly ease programming, etc. She does list the NSF's supercomputing centers, though chances are that few researchers actually use them today.

The Future of Computing: Non-algorithmic Software (2, Insightful)

Louis Savain (65843) | more than 8 years ago | (#14981333)

Consider that our basic approach to computer programming has not changed in over a century and a half. It all started when Lady Ada Lovelace wrote the first algorithm (or table of instructions) for Babbage's gear-driven analytical engine. Software construction has been based on the algorithm ever since. As a result, we are now struggling with hundreds of operating systems and programming languages and the ensuing unreliability and unmanageable complexity. It's a veritable tower of Babel. Computing will not reach its true potential unless and until we abandon the algorithmic model and embrace a non-algorithmic, signal-based, synchronous software model. Only then will we be able to guarantee that our software systems are free of defects. There will be no limit to their complexity.
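For illustration only, here is a toy Python sketch of what a signal-based (dataflow/reactive) style might look like, in contrast to a sequenced algorithm. The Cell class and the wiring are invented for this example; it sketches the flavour of the idea, not a defect-free system of the kind described above.

class Cell:
    def __init__(self, func, *inputs):
        self.func = func
        self.inputs = inputs        # upstream cells this cell listens to
        self.listeners = []         # downstream cells to notify
        self.value = None
        for cell in inputs:
            cell.listeners.append(self)

    def set(self, value):
        # Inject a signal and propagate it to every downstream cell.
        self.value = value
        for listener in self.listeners:
            listener.update()

    def update(self):
        # React when an input changes; fire once all input signals are present.
        if all(c.value is not None for c in self.inputs):
            self.set(self.func(*(c.value for c in self.inputs)))

# Wiring: 'total' reacts to 'price' and 'tax_rate'; there is no call order to get wrong.
price = Cell(lambda: None)
tax_rate = Cell(lambda: None)
total = Cell(lambda p, r: p * (1 + r), price, tax_rate)

price.set(100.0)
tax_rate.set(0.2)
print(total.value)   # 120.0, produced by signal propagation rather than a sequenced program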

I had to stop reading this article.. (0)

Anonymous Coward | more than 8 years ago | (#14981829)


    This article went on for 3 pages, and said absolutely nothing. Ugh.

    Want to know what I think is going to happen? Collections of personal items, collections of group items, and the collaborative sharing of those items. We see the first baby steps of that today with P2P and MySpace/LiveJournal/Friendster-ish portals. Not too far from now, it will boil down to a protocol, with corresponding client and server pieces. The same evolution has been mirrored in everything from ftp to http. Think sharep, a protocol for the universal sharing of personal data collections.
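Purely as a speculative sketch (sharep does not exist; the manifest fields and URL below are invented for illustration), such a protocol might start with nothing more than a collection manifest served over HTTP:

import json
from urllib.request import urlopen

# What a shared-collection manifest might contain (all fields hypothetical).
example_manifest = {
    "owner": "alice",
    "collection": "holiday-photos-2006",
    "items": [
        {"id": "img001", "type": "image/jpeg", "size": 204800},
        {"id": "img002", "type": "image/jpeg", "size": 189440},
    ],
}

def fetch_manifest(url):
    # Download and parse a remote collection manifest (hypothetical endpoint).
    with urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

# e.g. fetch_manifest("http://example.com/sharep/alice/holiday-photos-2006")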

    In the meantime, I'd be willing to settle for having my desktop be rewindable [google.com] and having an intelligent way to index and cross-reference my pictures and music. My 2006 needs are fairly simple.

Singularity == nuclear fusion (1, Funny)

Anonymous Coward | more than 8 years ago | (#14982134)

The singularity is going to be like practical nuclear fusion power: always 15-20 years away.

Creativity Machine: it already invented its v2.0 (2, Informative)

Anonymous Coward | more than 8 years ago | (#14982310)

Check out http://imagination-engines.com/ [imagination-engines.com], a US company founded by AI researcher Dr. Stephen Thaler. In summary, his systems are composed of paired neural nets in tandem, where the first is degraded/excited to produce 'novel ideas' (the 'dreamer') and the second is intended as a 'critic' of the first system's output, or a filter for 'useful' ideas.
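As a rough sketch of the general idea (this is not Thaler's actual system; the network shapes, noise level and keep-threshold below are made up for illustration):

import numpy as np

rng = np.random.default_rng(0)

# "Dreamer": a tiny one-layer net mapping a seed vector to a candidate "idea".
W_dreamer = rng.normal(size=(8, 8))

# "Critic": scores a candidate; here just a fixed linear scorer for illustration.
w_critic = rng.normal(size=8)

def dream(seed, noise=0.5):
    # Perturb the dreamer's weights to generate a novel candidate.
    W_noisy = W_dreamer + noise * rng.normal(size=W_dreamer.shape)
    return np.tanh(W_noisy @ seed)

def criticise(candidate):
    # Score the candidate; higher means 'more useful' to the critic.
    return float(w_critic @ candidate)

seed = rng.normal(size=8)
candidates = [dream(seed) for _ in range(100)]
keep = [c for c in candidates if criticise(c) > 1.0]
print("kept %d of %d dreamed candidates" % (len(keep), len(candidates)))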

In real-life applications, it was used to invent an Oral-B toothbrush product.

At one time the site's literature announced that invention number 1 (the CM, Creativity Machine) produced invention number 2 (STANNO, Self-Training Artificial Neural Object).
