
Handheld Supercomputers in 10-15 Years?

CmdrTaco posted more than 6 years ago | from the oh-yeah-that-i-need dept.

Supercomputing 240

An anonymous reader writes "Supercomputers small enough to fit into the palm of your hand are only 10 or 15 years away, according to Professor Michael Zaiser, a researcher at the University of Edinburgh School of Engineering and Electronics. Zaiser has been researching how tiny nanowires — 1000 times thinner than a human hair — behave when manipulated. Apparently such minuscule wires behave differently under pressure, so it has up until now been impossible to arrange them in tiny microprocessors in a production environment. Zaiser says he's figured out how to make them behave uniformly. These "tamed" nanowires could go inside microprocessors that could, in turn, go inside PCs, laptops, mobile phones or even supercomputers. And the smaller the wires, the smaller the chip can be. "If things continue to go the way they have been in the past few decades, then it's 10 years... The human brain is very good at working on microprocessor problems, so I think we are close — 10 years, maybe 15," Zaiser said."


Yes, it will run linux (4, Funny)

140Mandak262Jamuna (970587) | more than 6 years ago | (#21156155)

Before anyone asks. Also you can imagine a beowulf cluster of these, as well as welcome the overlords.

Re:Yes, it will run linux (5, Funny)

JK_the_Slacker (1175625) | more than 6 years ago | (#21156167)

However, these STILL won't run Vista at full speed.

Re:Yes, it will run linux (4, Funny)

jollyreaper (513215) | more than 6 years ago | (#21156373)

However, these STILL won't run Vista at full speed.
You know what the best way to accelerate Vista is? 9.8 meters per second per second.

Re:Yes, it will run linux (0, Redundant)

AmaDaden (794446) | more than 6 years ago | (#21156327)

Is that beowulf cluster in your pocket or are you just happy to see me?

Sorry, had to say it...

Re:Yes, it will run linux (0)

Anonymous Coward | more than 6 years ago | (#21156383)

Slashdot would be improved substantially by banning everyone using those tired, unfunny memes in a highly tenuous fashion. Also editors who actually exercised editorial control, dupe checking that wasn't a bad joke, getting rid of obvious shills like Roland and idiots like Twitter ...

Re:Yes, it will run linux (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#21156521)

Amen. It's been working for eBaum's World this last week.

Re:Yes, it will run linux (0, Offtopic)

somersault (912633) | more than 6 years ago | (#21156527)

I for one would welcome our new funny memeless overlords

You Stole My Joke.... (0, Offtopic)

StressGuy (472374) | more than 6 years ago | (#21156993)

You Insensitive Clod!!!


Re:Yes, it will run linux (0, Offtopic)

Kamokazi (1080091) | more than 6 years ago | (#21157171)

Yes, but will in Soviet Russia, will it blend me?

Why supercomputers? (5, Insightful)

Ckwop (707653) | more than 6 years ago | (#21156187)

Isn't a super-computer a relative term? I mean, I don't know the exact figure, but I would guess that my Dual Core Intel box at home is probably a good deal faster than a super-computer from the 80s. It is probably hundreds of thousands or perhaps millions of times more powerful than the computers used in the Apollo programme. Surely the measure of what is a super-computer and what isn't must be based upon what the fastest machines in the world are at that time.

Perhaps what he means is that what we currently do with supercomputers today will be able to be done with low cost computing. I can certainly see that being true. In fifteen years, it may be possible to adequately simulate nuclear weapons tests, climate models, or protein folding from a run-of-the-mill desktop.

However, the improvements in computing speed will also apply to super-computers. With that extra power you can run more refined models so I can't see how this could obsolete the traditional bulky super-computer.

In short, I can't really understand the super-computer slant of the article. Why not just talk about general-purpose computing instead?


Re:Why supercomputers? (5, Insightful)

Helios1182 (629010) | more than 6 years ago | (#21156211)

Talking about general purpose computing doesn't make headlines. That's why.

Re:Why supercomputers? (3, Insightful)

Wiseman1024 (993899) | more than 6 years ago | (#21156581)

Mod parent insightful.

When people don't have news, they make them up. They go and interview anyone who then pulls numbers out of his ass, and thus the "storage technology of the week", "power source of the week", "processing power prediction of the week", etc. is born.

These articles should be considered spam.

Re:Why supercomputers? (5, Funny)

FudRucker (866063) | more than 6 years ago | (#21156295)

you can always tell a supercomputer by the big red "S" on its chest...

Re:Why supercomputers? (1)

StarfishOne (756076) | more than 6 years ago | (#21156577)

I believe it should be a 'G'... That Hackers movie I once saw was so realistic that I now believe that every supercomputer just has to be called 'Gibson'. ;P

Re:Why supercomputers? (5, Funny)

IndustrialComplex (975015) | more than 6 years ago | (#21156297)

Because it doesn't grab as much attention. If I told you that in 15 years you would have a faster general purpose computer, that wouldn't be newsworthy now, would it?

Here are the measurements of my super computer

200,000 Libraries of Congress, or 17 great lakes.
15 Empire state buildings, stacked end to end in a giant circle.
The power consumption of 3 New York Cities.
All the potatoes in Idaho.
Seating for 1.5 747 jumbo jets!
And enough punchcards to circle the moon!

Re:Why supercomputers? (3, Informative)

Anonymous Coward | more than 6 years ago | (#21156329)

More to the point, Supercomputers are not called "Supercomputers" because they are simply faster than other machines. Supercomputers are large-scale vector machines designed for number-crunching capacity. They're great at scientific modeling and simulation, but aren't exactly something all that useful to the average person. (Unless you somehow think that the Cell in the PS3 was the smartest idea ever.)

Also, like most things in computing, "Supercomputer" is a moving target. Today's supercomputers tend to be large clusters of inexpensive machines running OSes like Linux, Mac OS X, or Solaris. (Windows supercomputing clusters probably exist as well, but I doubt that many organizations are willing to pay the software licensing fees.) So unless we can have a 500 processor distributed computing cluster in a Palmtop in 10 to 15 years, I seriously doubt we'll have "a handheld supercomputer". And if you want to go by the supercomputers of yesteryear, technically we already have that power in our handhelds. E.g., an iPhone's SIMD-equipped 625 MHz ARM processor could probably hold its own in vector calcs against some of the earlier supercomputer installations.

Sooo.... I call sensationalist headlines. Do I win a prize?

Re:Why supercomputers? (2)

HateBreeder (656491) | more than 6 years ago | (#21157051)

You see, the problem with calling a supercomputer "a cluster in a palmtop" is that there's nothing stopping us from stacking a room full of these "palmtop" devices and making an even larger cluster.

I think the definition of a supercomputer should be changed to something along these lines:
"A supercomputer is any computer which is considered one of the top-N fastest computers in the world today."

Re:Why supercomputers? (0)

Anonymous Coward | more than 6 years ago | (#21156369)

The computer on the Apollo programme for the first moon landing had 8 kilobytes of memory.

Re:Why supercomputers? (2, Interesting)

Anonymous Coward | more than 6 years ago | (#21156495)

More specifically, the Apollo Guidance Computer was a 512 kHz (a quad-division of its 2.048 MHz clock) Integrated Circuit Processor with 4 kilowords of magnetic core memory arranged as 16-bit words of 14-bit data, 1-bit overflow, and 1-bit sign. There was only one general purpose register supplemented by four "editing locations" in main memory. Three other registers were accessible for extra information from multiply and divide instructions, and the program counter location. The system was booted from a whopping 32 kiloword ROM chip made out of core-rope memory.

It was an amazing computer for its time (in some ways it still is), but computers quickly met and surpassed its design, all on a single chip.
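The word sizes above make the totals easy to check; here's a quick sketch (my own illustration, using only the figures in this comment, and assuming 8-bit bytes for the conversion, which the 16-bit-word AGC never used itself):

```python
# Apollo Guidance Computer memory totals, from the figures above.
ram_words = 4 * 1024       # 4 kilowords of magnetic core memory
rom_words = 32 * 1024      # 32 kiloword core-rope ROM
bits_per_word = 16         # 14 data bits + 1 overflow + 1 sign

ram_bytes = ram_words * bits_per_word // 8   # 8192 bytes, i.e. 8 KB
rom_bytes = rom_words * bits_per_word // 8   # 65536 bytes, i.e. 64 KB

print(f"RAM: {ram_bytes} bytes, ROM: {rom_bytes} bytes")
```

Which agrees with the "8 kilobytes" figure cited a few comments up.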

Re:Why supercomputers? (4, Interesting)

c (8461) | more than 6 years ago | (#21156427)

> Isn't a super-computer a relative term?


Unless they're talking about something significantly outside the progression we've accepted as Moore's Law. We've come to accept that a super-computer is normally a collection of hundreds of bleeding edge processors. So if they're talking about a handheld ten years from now which is perhaps 1024*(2^(240/18)) times more powerful than a single current bleeding edge CPU, then they could be justified in calling it a super-computer.

They may also be using super-computer to describe a system fast enough that it doesn't need an upgrade to run whatever Carmack pushes out at the time.
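The parent's speedup figure can be sanity-checked by just evaluating the expression from the comment (the 240-month horizon and the 18-month doubling period are the parent's assumptions, not facts):

```python
# Evaluate the parent's expression: 1024 * 2^(240/18),
# treating 18 months as one performance-doubling period.
doublings = 240 / 18                 # doubling periods in 240 months
factor = 1024 * 2 ** doublings       # roughly 10.6 million
print(f"{factor:.3g}")
```

So the parent is positing a handheld on the order of ten million times a single current bleeding-edge CPU.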


Re:Why supercomputers? (1)

sayfawa (1099071) | more than 6 years ago | (#21156429)

I also wondered a while back how powerful my computer is compared to supercomputers of the past, and found, using this page and a rough conversion to GFLOPS, that my desktop is only about as good as a supercomputer from the 80s.

Re:Why supercomputers? (0)

Anonymous Coward | more than 6 years ago | (#21157101)

A modern PC with a dual or quad core CPU and a single or dual 8800 GPUs can deliver a greater GFLOPS rating than any 80s or 90s "supercomputer".

Re:Why supercomputers? (1)

homey of my owney (975234) | more than 6 years ago | (#21156431)

Indeed. In fact I currently have a blazing supercomputer on my desk, if we use the standards for the term 15 years ago. And if we use what the term meant 30 years ago - I have a computer on my desk that, frankly, I think is impossible... Can they really make them go that fast?

Of course we'll have a super computer on our desk in 15 years. We always do.

Re:Why supercomputers? (1)

YU Nicks NE Way (129084) | more than 6 years ago | (#21157195)

In fact, the term became functionally useless a few years ago when a second tier computer manufacturer started advertising that its newest product would technically be illegal to ship to certain countries. The various first tier manufacturers had been producing machines with those qualities for years.

Back then, though, we called them "3d graphics cards".

Re:Why supercomputers? (1)

vil3nr0b (930195) | more than 6 years ago | (#21156467)

To me this is how to define supercomputing in today's reality, and it will continue to apply: when installing a cluster, supercomputer, etc., take one computer like the one in somebody's house and install it in a rack-mountable case with thousands of others. If this guy can make one handheld computer function as thousands, it will only make my customers buy thousands of handheld computers. This is why datacenters will be hard pressed to go away.

Re:Why supercomputers? (1)

MM_LONEWOLF (994599) | more than 6 years ago | (#21156499)

Because "Handheld Supercomputer!" sounds better on the box than "Relatively Small General Purpose Computer."

Re:Why supercomputers? (2, Funny)

hey! (33014) | more than 6 years ago | (#21156531)

Isn't a super-computer a relative term?

No, I think we should insist on a fixed definition of any performance class, which would serve geeks because we could know unambiguously exactly how much computing capacity anybody means when they use a term like "supercomputer". You could even record a conversation and play it back twenty years later, and everybody would know whether we were talking about enough computing power to, say, crack a 56 bit DES key in less than a week.

It would benefit our colleagues in marketing, because coming up with a term for the next generation of practically achievable level of computational power would provide a focus for their frustrated creative energy. Why should all the burden of innovation fall on geeks? Next, our friends the lawyers also benefit, because they'll have a major fight every few years about whether the terms coined by the marketing people have become generic or not. This is a fight which they will eventually lose, providing us with another non-ambiguous, non-proprietary term for a level of computational performance.

Re:Why supercomputers? (1)

Eivind (15695) | more than 6 years ago | (#21156901)

Yeah !

And we could do it like the guys who named the radio spectrum: they divided it into Low, Medium and High Frequency. Simple. Only, there's not really an upper bound on frequency, now is there? The result was inevitable.

We then got VHF - VERY high frequency.

Then UHF - ULTRA high frequency.

Then SHF - SUPER high frequency.

Then EHF - EXTREMELY high frequency.

The only thing that prevented us from running into SPHF - Stupendously High Frequency - was the fact that by this time, we were running into IR territory.

While the original terms were easy to understand, ask your grandma to sort "very, ultra, super or extremely" high frequency. I'd never have guessed that "super" is more than "ultra", but whatever.

Anyways, with a fixed def for "supercomputer" we'll end up with a low-end palm-device in a few years classed as a "super-ultra-mega-extreme computer", probably shortened to "SUME-class" :-)

Re:Why supercomputers? (1)

wlad (1171323) | more than 6 years ago | (#21156737)

Exactly my thought... yesterday's supercomputer is today's desktop PC. Supercomputers will still be a lot faster than handhelds, even then :)

Vista's successor will render it useless (1)

Amitz Sekali (891064) | more than 6 years ago | (#21156785)

By the time such computers exist, Vista's successor will use all of that computing power.

Re:Why supercomputers? (1)

garett_spencley (193892) | more than 6 years ago | (#21156849)

Perhaps what he means is that what we currently do with supercomputers today will be able to be done with low cost computing. I can certainly see that being true.

I don't just see that as being "true" .. I see that as "um ... no fucking shit sherlock".

As you already put it, today's PCs ARE super computers relative to the computing power of 10 - 15 years ago. So of course tomorrow's hand-helds will be super computers relative to today's computing power. It's just the way things have gone up until now, with no foreseeable change in the trend. It would take a huge roadblock in computing technology development to make it not so.

But then, I didn't RTFA so it is conceivable that I am completely missing the point.

Re:Why supercomputers? (2, Interesting)

A nonymous Coward (7548) | more than 6 years ago | (#21156977)

The super computer I worked on in 1970 was a CDC 6400, which came out in 1966, kid brother to the 6600 of 1964. They had a memory cycle time of 1 microsecond for 60 bits, and I think 64K words but I forget exactly. Instructions executed in various times, but the 6600 could pipeline to an extent; call it a 2-3 MHz machine with 512K of core memory.

$10M or so.

That was the supercomputer of then, and today you can't buy a computer that slow. I don't know what goes in wristwatches these days, but I bet they are faster.

As for 1980-85, those very early PCs were faster in MHz but didn't do as much per instruction, and didn't have quite that much memory, but they were surely close.

Yeh, this clown will have a handheld 2007 supercomputer in 2022. Big deal. So will everybody. It will be your cell phone / iPod replacement and they will be as ordinary as wristwatches used to be before they fell out of fashion. But there will be faster computers, probably not hand held, and they will be the supercomputers of that day.

Re:Why supercomputers? (1)

Reality Master 101 (179095) | more than 6 years ago | (#21157045)

The clock speed of the legendary Cray 1 was 80 MHz. With two instructions going per cycle, you could theoretically get 160 MFLOPS. These are laughable speeds by today's standards, but back then it was considered unbelievable.
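That peak figure is just the clock rate times results per cycle; a quick sketch of the arithmetic (my own illustration, using only the numbers in the comment above):

```python
# Cray-1 theoretical peak, from the figures above.
clock_hz = 80e6            # 80 MHz clock
results_per_cycle = 2      # two instructions going per cycle, per the comment
peak_flops = clock_hz * results_per_cycle

print(f"{peak_flops / 1e6:.0f} MFLOPS")  # 160 MFLOPS
```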

Re:Why supercomputers? (1, Informative)

stonecypher (118140) | more than 6 years ago | (#21157097)

Isn't a super-computer a relative term?
No. Supercomputer is a specific term with a specific speed attached, and has been since the word was coined in the 1970s. The word is backed by law, because of export restrictions. A supercomputer can perform a trillion floating point operations per second (one teraflop), which was a goal that was difficult at government scale in the 1970s, and is now not all that big a deal. You remember when that North Carolina State professor made a supercomputer out of eight PS3s? He couldn't have done that if "supercomputer" didn't have a rock solid meaning. It's one of those things that only old people seem to know anymore, like that a byte is not necessarily an octet, that bits per second and baud aren't the same thing, or that bandwidth and storage - and indeed everything but RAM - is measured base 2 instead of base 10.

Surely the measure of what is a super-computer and what isn't must be based upon what the fastest machines are in the world at that time.
Nope. That would mean that something that's a supercomputer in year 1 might not be in year 2, which would reduce supercomputer to a marketing term. Believe it or not, computers are measurable. Some terms have actual meanings. This is one of those.

I mean, I don't know the exact figure but I would that my Dual Core Intel box at home is probably a good deal faster than a super-computer from the 80s.
Nope. Home PCs will likely cross the teraflop threshold around 2012. All supercomputers from every era have the same processing threshold. A current quad-CPU dual core box would be enough.

Perhaps what he means is that what we currently do with supercomputers today will be able to be done with low cost computing.
Nope. He means a teraflop.

However, the improvements in computing speed will also apply to super-computers.
Generally speaking, once a computer has been manufactured, technology improvements do not alter it. There are exceptions, especially in computers which are limited by temperature, but not many. A supercomputer from the 1970s is still a supercomputer today. Please stop attempting to argue with an article on grounds of metaphor structured around words of which you don't know the meaning.

In short, I can't really understand the super-computer slant of the article. Why not just talk about general-purpose computing instead?
The interviewee is old enough to know that supercomputer means something fixed, and that therefore there is a threshold to be crossed in the fashion of getting a supercomputer into a specific form factor. The interviewer doesn't understand geeks well enough to know that they won't know what a supercomputer is, and fails to explain, probably expecting people to go read the deeply wrong article on Wikipedia. That help?

Re:Why supercomputers? (1)

Lumpy (12016) | more than 6 years ago | (#21157161)

You are 100% correct. We already have handheld supercomputers; some of the current subnotebooks are incredibly powerful. Hell, we even have wrist supercomputers by the 1960s definition.


Anonymous Coward | more than 6 years ago | (#21156197)

God bless you, Adolph Hitler


Anonymous Coward | more than 6 years ago | (#21156275)

lol wut

Already here (2, Insightful)

ktappe (747125) | more than 6 years ago | (#21156199)

Today's handheld devices ARE the supercomputers of decades past. Things are always getting faster and smaller. If you took a WinCE device or iPhone back 15 years, you'd blow people's socks off.

Re:Already here (1)

Chineseyes (691744) | more than 6 years ago | (#21156407)

Mod parent up; this is exactly what I thought when I read the article.

Re:Already here (1)

jackpot777 (1159971) | more than 6 years ago | (#21156959)

If you took today's newspaper back 15 years, you'd blow people's socks off.

I know I'd blow the socks off the bookies in Vegas / the High Street (delete as appropriate). Red Sox sweeping the Rockies in the 2007 World Series / Manchester United beating Middlesbrough 4-1 to go top of the league the same weekend / the election winners in France, Argentina, the US mid-terms etc.

I think the computing power thing would pale compared to the whole time machine gizmo thing and the 1000-1 bets thing. With knowledge like that, you could break the bank.

Re:Already here (1)

stonecypher (118140) | more than 6 years ago | (#21157129)

No, they aren't. A supercomputer can perform a teraflop. That's the definition of supercomputer, and it has been since the word was coined by the government in the 1970s in order to define export restrictions. That's what the article is about: a teraflop in the palm of your hand. That's why that NC State professor was able to cluster eight PS3s and call it a supercomputer. Remember that? He would have been laughed off of campus if supercomputer meant "omg whatever is fast this week."

Nobody cares whose socks are being blown off. Supercomputers from the 1970s are still supercomputers today. It's a specific measurement. Please read a book.

The Not Too Far Future (5, Funny)

eldavojohn (898314) | more than 6 years ago | (#21156201)

10-15 years from always, I'll wake up to my alarm clock, powered by cold fusion. I'll stumble down stairs and get the keys to the hover car from the kitchen and grab my hand held supercomputer. On the way to work, I'll play Duke Nukem Forever as my car flies me along the correct path.

Re:The Not Too Far Future (5, Interesting)

infolib (618234) | more than 6 years ago | (#21156317)

I had a lecturer who explained that when applying for grants you'd always like the research to have imminent application. On the other hand, if you put the deadline too early, you, or the people who granted the money, might have to face responsibility for the failure. In between there was a sweet spot, which he gauged to be around 15 years or so. Ever since then I've honored him by referring to this phenomenon as the "Flensberg Optimum".

Re:The Not Too Far Future (1)

PlatyPaul (690601) | more than 6 years ago | (#21156907)

With 15 years of lead-time, you'd better file the patents right now. Sad, but true.

See also the Friedman Unit (1)

jahknow (827266) | more than 6 years ago | (#21157235)

One Friedman Unit, also known as "one Friedman" or "one F.U.", equals six months. The term is a tongue-in-cheek neologism coined by blogger Atrios (Duncan Black) in reference to the discovery by Fairness and Accuracy in Reporting of journalist Thomas Friedman's repeated use of "the next six months" as the time period in which, according to Friedman, "we're going to find out...whether a decent outcome is possible" in the Iraq War. FAIR cited his use of the phrase as early as 2003.

Re:The Not Too Far Future (0)

Anonymous Coward | more than 6 years ago | (#21156479)

down stairs
You mean travel tube, of course.

Re:The Not Too Far Future (1)

MM_LONEWOLF (994599) | more than 6 years ago | (#21156517)

And after all these advancements, I bet the 2 highest uses for computers will still be video games and porn.

In the not too far future (-1, Offtopic)

Colin Smith (2679) | more than 6 years ago | (#21156689)

The sun will wake you up; the electricity won't be reliable or cheap enough for you to afford to spend it on an electric alarm clock. You'll stumble down stairs, grab your sickle, go out, get on your bicycle and ride to work. You'll spend the day cutting Soya, Willow, Switchgrass, Jatropha or some other energy crop, because the product is too expensive to spend fuelling farm machinery. There will be hundreds like you in the fields; the cities will be largely deserted, only the rich able to afford to live there.

what about solid state storage advances... (0)

Anonymous Coward | more than 6 years ago | (#21156205)

Sometimes I wonder if in 10 years we will still be using mechanical hard drives.

1-inch multi-terabyte hard drives, but mechanical hard drives nevertheless.

Re:what about solid state storage advances... (1)

Bee1zebub (1161221) | more than 6 years ago | (#21156643)

See the link for more nanotechnology computer components: this time, memory.

Of course, magnetic disks are still the best things we have for the hard-drive niche, at least for the next few years, given that only RAM + batteries or flash memory really compete, and both are more expensive, and flash has a much more limited life (counting writes), making it very bad for things like swap partitions or temp files. In the more distant future, I am sure something will replace HDDs, just like RAM chips replaced magnetic drums and HDDs replaced cards and paper tape, but they will be here for some time yet.

Re:what about solid state storage advances... (1)

sm62704 (957197) | more than 6 years ago | (#21156841)

Sometimes I wonder if in 10 years we will still be using mechanical hard drives

I wondered that 20 years ago. I'm still wondering.


10-15 years? (1, Redundant)

porcupine8 (816071) | more than 6 years ago | (#21156209)

Isn't "supercomputer" a bit of a relative term? Don't we have supercomputing handhelds today, if you look at the original supercomputers?

Re:10-15 years? (2, Informative)

maeka (518272) | more than 6 years ago | (#21156339)

A quick Google search appears to show modern PDAs competing nicely with a mid-80's Cray.

Pocket Cray-1 (1)

ja (14684) | more than 6 years ago | (#21156685)

You'd like to have a PDA with good double precision (64-bit) floating point performance then, which most do not have. But an AMD Geode - as used in the OLPC project - could fit the description.

Re:Pocket Cray-1 (1)

maeka (518272) | more than 6 years ago | (#21156967)

Very solid point, as most (all?) of the XScale processors do not have an FPU.

Re:10-15 years? (0, Troll)

stonecypher (118140) | more than 6 years ago | (#21157181)

Isn't "supercomputer" a bit of a relative term?
No. A supercomputer is a computer which can perform one teraflop. The origin of the term was to give the US Government a way to set export restrictions on computing hardware. A supercomputer from the 1970s is a supercomputer today. It has nothing whatsoever to do with whatever computers are fast today; the example that people seem to remember is the North Carolina State professor who clustered eight PS3s to make a supercomputer. If the term was defined in the context of the speed of its day, why wasn't he laughed off of campus?

The article wouldn't make sense if it was a relative term. "Supercomputer" means one trillion floating point operations per second or better, period.

Define Super Computer (1)

Chris_Stankowitz (612232) | more than 6 years ago | (#21156221)

Is it really going to be a Super Computer, given that in 10 to 15 years computers that are larger than this one will more than likely be much faster? A little sensational, really...

10-15 years? (1)

Seto89 (986727) | more than 6 years ago | (#21156225)

10 - 15 years till they are made. 100 - 150 years till they travel back in time, killing everyone named Sarah Connor. Will they still run Linux at that point?

Re:10-15 years? (1)

eln (21727) | more than 6 years ago | (#21156649)

Yes, they'll be running Ubuntu Vomitous Vole.

Re:10-15 years? (1)

wed128 (722152) | more than 6 years ago | (#21156891)

i was just wondering...
What are they gonna call the ubuntu that comes after the Zesty Zephyr?

Re:10-15 years? (1)

Fred_A (10934) | more than 6 years ago | (#21157257)

i was just wondering...
What are they gonna call the ubuntu that comes after the Zesty Zephyr?
From what I gathered, either the world will end or we will have reached the singularity, making the issue moot.
Didn't fully convince me either.

Captain obvious to the rescue! (2, Interesting)

Synthaxx (1138473) | more than 6 years ago | (#21156233)

Most of today's cellphones are the super computers of yesteryear. What's really interesting, though, is what tomorrow's super computers will be.

Re:Captain obvious to the rescue! (1)

stonecypher (118140) | more than 6 years ago | (#21157205)

Most of today's cellphones are the super computers of yesteryear.
A supercomputer is a legal term meaning a computer which can perform one teraflop or more. There is no cellular phone (yet) which has crossed that threshold, and the very first supercomputer made is still a supercomputer today. Please stop attempting to learn your computer science from Wikipedia, as it's written by people whose knowledge is akin to yours.

Hopefully... (1)

Shadow Wrought (586631) | more than 6 years ago | (#21156235)

They will come up with a better name than BrainPal.

Re:Hopefully... (1)

calebt3 (1098475) | more than 6 years ago | (#21156341)

It will be the "Google iImplant"

I am predicting a merger.

"the smaller the wires... (0)

Anonymous Coward | more than 6 years ago | (#21156251)

... the smaller the chip can be".

Isn't it also true that the smaller the wires, the more likely electromigration will be a problem?

Re:"the smaller the wires... (1)

Solra Bizna (716281) | more than 6 years ago | (#21156763)

Not to mention heat dissipation.


Vista slowness -- seriously (3, Insightful)

wonkavader (605434) | more than 6 years ago | (#21156259)

We've already had a joke here saying Vista won't run at full speed, but I think there's a kernel of truth there.

If you can put a supercomputer in your hand, it's not a supercomputer. A week ago, we had an article here on a guy who'd wired several PS3s together and called it a supercomputer. Folks didn't agree with the supercomputer designation, even though he was getting flops that would clearly have been supercomputer speed just five or six years ago. It's not speed that defines a supercomputer, it's speed relative to what's commonly available.

If we crunch down machines to incredibly small size, then research institutions will buy one 50 times that size. Every time. What will happen is that that tech (if it's not expensive) will drive PC speeds up, perhaps phenomenally, software development tools will make use of the extra speed to make programming easier at the expense of run-time, and we won't see significant speed increases in the user experience. The user will be able to do more, of course, but he'll be complaining "When I speak into the microphone to tell it to write a three page synopsis of this book in its library, it stalls and lags, and sometimes I tell it twice before I get a response, and then it gives me two outputs. This thing is SLOW."

We already have handheld supercomputers (3, Interesting)

TrumpetPower! (190615) | more than 6 years ago | (#21156283)

No, really. An iPhone is much more powerful than the Cray-1, and probably significantly more powerful than a Cray X-MP. The iPhone certainly has much more RAM and storage than the typical early Crays; I can't be bothered right now to find out what kind of MFLOP performance an iPhone has.



Cool... (1)

TechnoBunny (991156) | more than 6 years ago | (#21156287)

...presumably it will be useful to control my flying car.

Nonsense (4, Funny)

93,000 (150453) | more than 6 years ago | (#21156291)

I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings of Europe will own them.

Not Going to happen... (1)

phoenixwade (997892) | more than 6 years ago | (#21156313)

Oh, the processing power will be there... But we will have redefined what a "SuperComputer" is before then, so the term will change before the power gets there.

Am I missing something? (4, Insightful)

jollyreaper (513215) | more than 6 years ago | (#21156321)

Technically, isn't my cell phone a super-computer by the standards of previous generations? Or is it not a matter of processor horsepower but the size of the bus?

The analogy I've seen comparing big iron (midrange and mainframes) vs. PCs is: "Yeah, the PC is zippy, but it's like a ninja bike. The big iron is like a dump truck. The midrange isn't going to get up to speed as quickly, but it's going to be doing a hell of a lot more for the effort."

No handheld supercomputers (3, Interesting)

Culture20 (968837) | more than 6 years ago | (#21156377)

We won't have handheld supercomputers ever. If you have a handheld supercomputer, you can have a cluster of them, or better yet, a desktop sized computer so you're not wasting space with screens, batteries, and casings. Until the input/output problem for tiny devices is solved, handhelds will be PDAs and game devices (maybe doing neat things that today's desktops do, but very few will use them to try to crack the latest encryption algorithm).

Power (1)

squoozer (730327) | more than 6 years ago | (#21156391)

Maybe you will be able to hold a machine that matches a current supercomputer's power in your hand in ten years, but there is one thing it won't be able to do in your hand: run.

Extrapolating power consumption over the last ten years would seem to indicate that this "supercomputer in your hand" would probably be glowing red hot. Before we increase computing power much more, we need to get a handle on efficiency.

Handheld Supercomputers (1)

MM_LONEWOLF (994599) | more than 6 years ago | (#21156437)

And we still can't find a decent place to get them fixed when they're broken.

Poor premise (1)

bradgoodman (964302) | more than 6 years ago | (#21156483)

Maybe if your PDA used chips that were built of independent die-bonded cores, this would apply. But for any mass-marketed device, the chips are all single-die devices. This is both much easier to manufacture (which yields cheaper parts) and the density is much higher.

The real factor here is Moore's Law. When you can put more and more transistors on a single chip/die, you only have a few (basic) options: using them for more integrated peripherals, using them for more cache/memory, or using them for more cores.

It is arguable which method will yield faster performance/more power for a given application, but no doubt: just a few years ago multi-processor (multi-core) machines were reserved for "high-end" or "elite" applications; today, basic workstations and even laptops have them.

Moore's law and basic math can tell you how this will (probably) translate into smaller devices.
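That "basic math" can be sketched roughly like this (the two-year doubling period and the 100-million-transistor starting point are assumptions for illustration, not figures from the comment):

```python
# Rough Moore's-law extrapolation: transistor budget doubling every
# ~2 years (an assumed cadence; the real one has varied over time).
def transistor_budget(base_count, years, doubling_period=2.0):
    """Projected transistor count after `years` at the given doubling period."""
    return base_count * 2 ** (years / doubling_period)

# A hypothetical 100-million-transistor chip projected 10 years out:
print(int(transistor_budget(100e6, 10)))  # 3200000000, i.e. a 32x budget
```

Whether that 32x budget goes to peripherals, cache, or cores is exactly the trade-off described above.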

So..... (1)

The -e**(i*pi) (1150927) | more than 6 years ago | (#21156487)

So how are we going to have thousands of processors in a little PDA, each having the futuristic equivalent of millions of cores, or even quantum cores? And isn't there some law of thermodynamics that will eventually require a certain amount of processing to consume a certain amount of energy, so as not to decrease entropy? So in 15 years we will have processors small enough to fit thousands in a small PDA, with building-sized liquid helium cooling through superconducting heatsinks, and a small power plant for energy, right?

Ok, this won't work! (0, Offtopic)

laejoh (648921) | more than 6 years ago | (#21156545)

How am I supposed to enjoy pr0n when I have to hold this so called supercomputer in my hands???

The bad thing... (1)

fph il quozientatore (971015) | more than 6 years ago | (#21156559)

...that all their number crunching programs will be written in Fortran 2015.

10 -15 years away 50 years from now.... (1)

ZonkerWilliam (953437) | more than 6 years ago | (#21156625)

Sounds like fusion power: always 10-15 years away!

Been there, done that. (1)

moosesocks (264553) | more than 6 years ago | (#21156635)

The 1970s called. They want their hype back.

*POOF* (3, Funny)

thatskinnyguy (1129515) | more than 6 years ago | (#21156659)

What was that that just flew by me? Oh yeah! It was the vapor that is this article!

Seventeen years (0)

Anonymous Coward | more than 6 years ago | (#21156779)

They'll patent their invention. People will start using it when the patent runs out. I think a corollary to Murphy's law is in effect here.

There are a lot of things whose patent ran out before people actually started using them. Spread spectrum comes to mind. It was patented circa WW2. Nobody used it for about thirty years. Now we can't live without it.

We have hand held supercomputers now (1)

sm62704 (957197) | more than 6 years ago | (#21156781)

What was a supercomputer when I got my first computer (a Sinclair, 1 MHz w/ 4K memory) is now called a "mobile phone".


Great in Winter (1)

nagora (177841) | more than 6 years ago | (#21156853)

Best handwarmers money can buy. In fact, possibly too good. Oven gloves not included.


I already have one. (1)

saider (177166) | more than 6 years ago | (#21156917)

I already have a supercomputer on my desk, relative to the standards of 10-15 years ago.

Make it stop (1)

boyfaceddog (788041) | more than 6 years ago | (#21156957)

"[T]hat will be a huge step for the industry, considering that not so long ago supercomputers filled up enormous rooms or even entire buildings."

Every freakin' time.

Sorry: Bullshit. (1)

WheelDweller (108946) | more than 6 years ago | (#21157027)

See also: flying cars, home grown clothing, and a Democrat who cuts taxes. :)

Think: is a handheld supercomputer going to be cheap? Was the first Xbox cheap? How about the first PlayStation 3? If it's not cheap, it's a novelty. And what would you do with that power?

Here's the point: the technology's getting ready to take a jump. But something held in your hand isn't friendly to input, and would have (at best) a complex printout. Just try editing your company's mission statement on your cellphone, and you'll see what I mean.

But as a tech-bump? Sure! Why not? But thinking we're going to walk around with 10x the desktop power on a wristwatch is just silly. It doesn't belong there, not yet. Where are the Pentiums and P2's and P3s? Not on our wrists...still on a desk or in a laptop. The form factor doesn't work.

As for tech on the go: how many of us really use this? A full 90%+ of us around the globe use our computers for email, browsing, document prep, and playing media. If we could multiply the power of our CPUs 100-fold, what would we do differently? Not much. That's what makes Linux so attractive: that and the no-illegality, no-virus stance.

Oh, sure- research organizations could farm it out, no doubt. Even local weather-casters could have their own 'models' too. But until Windows2020, no consumer's gonna have a reason to waste that much power, held in their own hand: this is tech-hype.

To be fair... (1)

vertinox (846076) | more than 6 years ago | (#21157037)

My DS is several times more powerful than my old 486sx. (Though still has the same amount of RAM)

Fear for your sanity! (1)

Panitz (1102427) | more than 6 years ago | (#21157063)

Am I the only one who worries this minuscule supercomputing power will be used in a future version of FURBY?!

An annoying toy with more intellect than its owner... it'll plot against us all... we'll be overrun by a mob of attention-seeking robotic creatures that just recite Pi to ten billion decimal places, over and over and over again!!

The horror!

super what (1)

planetfinder (879742) | more than 6 years ago | (#21157083)

As several people have pointed out, the notion of supercomputer is relative. The thing that you can hold in your hand today is usually far less capable than what you can fit in a room today.

With that in mind, we can assume that the author is referring to the idea of having something with the power of today's supercomputers in your hand within X number of years. Even with that understanding, today's supercomputers aren't capable of a very useful level of general intelligence, and the researchers are not concerned with addressing the technological issues associated with audio and visual interfaces that would help us avoid automobile accidents and other problems when we rely on these devices. Only some of the technological problems with audio-visual interfaces are related to component density and speed.

Regarding Moore's law as it relates to these issues, it is important to realize that doubling your component density and increasing your speed correspondingly is not likely to increase functionality in the same proportion. For many applications that require intelligence in a device, it seems that something like the logarithm of the density and speed is the relevant performance measure. Then there is the issue of user interface software technology.

Because of these types of issues, I'm usually not interested in updating any of my computers unless there is a crudely validated performance factor of at least 2, or unless there is a big improvement in the user interface technology that requires a hardware upgrade. In the case of server technology I can understand getting spun up over a 10% performance increase, but for most personal use a factor of at least 2 without a significant change in the user interface seems to result in no noticeable productivity improvement. Has anyone done a study of this?

Pedantically speaking (1)

zappepcs (820751) | more than 6 years ago | (#21157103)

Generally speaking, I have always had trouble with people in the tech industry who refuse to be pedantic about their terms and definitions. While it might technically be true that your desktop computer is as powerful as supercomputers of years past, it does not qualify as a 'supercomputer' for one reason: the purpose of said computer. Supercomputers are designed to tackle certain problems, or be capable of it. Your desktop machine is designed to be a general purpose machine capable of running... ughh... Windows. Show me 250,000 dollars worth of hardware designed to run Windows and I'll give you the supercomputer-on-your-desktop designation.

The clever use of clustered game controllers does go some way towards 'mini supercomputer' status, but might I suggest we give it another designation? High-performance vector computer, high-performance RISC computer, etc.

When network computing architecture allows for 25,000 cells working together across a network to create hitherto unknown FLOPS speeds, perhaps we can come up with other designations... like SkyNet or something.

In the meantime, I leave you with a car analogy:
If you invent an engine that would make a Mazda Miata seem to perform like a fuel dragster, you still would not call it a fuel dragster. Even if you can get 650 BHP out of your new Miata, it will still not work correctly/well on a 1/4-mile drag strip.

Mind you, I'm all for a super high performance RISC machine embedded in my cellphone just the same.

Wow! Hand held supercomputers! (1)

Charles Wilson (995273) | more than 6 years ago | (#21157111)

Will they let me run my All-In-Wonder video card under Linux?

handhelds vs supercomputer benchmarks (2, Interesting)

suitti (447395) | more than 6 years ago | (#21157117)

I recently picked up a Nokia 770. This device came out a couple years ago, say 2005. In 1985, I worked with a CDC Cyber 205 supercomputer. So, this is really 20 years, not 15. I have benchmark results for both, so why not compare?

The Nokia has 64 MB RAM. The '205 had 16 MB RAM. The Nokia kicks scalar code at about 40 to 100 MIPS. The '205 kicked scalar code at 35 to 70 MIPS. The Nokia has a DSP, which seems to be able to kick about 200 MFLOPS (I could be wrong). The '205 had twin vector pipes with a peak performance of 200 MFLOPS each, but it was rare to get more than 80% of that. My point is that they're comparable. The Nokia came with 192 MB file store, but now has 2.1 GB, and can mount my desktop filesystems over WiFi with better than 1 MB/sec throughput. The '205 had about 1 GB disk, and could mount mag tapes. Both sport native compilers for C, Fortran, etc. The Nokia was about $150. The '205 was about $15,000,000. That's a factor of 100,000 improvement in price/performance. The Nokia runs on batteries and fits in my shirt pocket, with the form factor of my old Palm Pilot. The '205 had a motor-generator power conditioner (the flywheel acts like a battery in a power failure) and fit in a large machine room with temperature and humidity carefully controlled.

Would I call the Nokia a supercomputer? No. Supercomputers cost more than a million dollars when they are new. Would I build a beowulf cluster of Nokias? Maybe. With WiFi, one might put together an ad-hoc grid pretty easily. I only have one. But my 4 year old desktop is more than 30 times faster, so it's going to be hard to justify from a pure performance standpoint. Yes, my desktop has better price/performance than the Nokia.
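Since the quoted scalar throughputs are in the same ballpark, the factor-of-100,000 figure is essentially just the price ratio; as a sketch using the numbers quoted above:

```python
# Price/performance sketch using the (approximate) figures quoted above.
nokia_price = 150            # USD, Nokia 770 (2005)
cyber205_price = 15_000_000  # USD, CDC Cyber 205 (1985)

# Scalar throughput is roughly comparable (tens of MIPS each), so the
# price ratio alone approximates the price/performance improvement.
ratio = cyber205_price / nokia_price
print(f"{ratio:,.0f}x")  # 100,000x
```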

I've not yet run a SETI@Home unit on the Nokia. It'd be much better than the one I ran on the 486/33...

We already have handheld supercomputers (1)

Serhei (1150661) | more than 6 years ago | (#21157185)

600 MHz, fits in your pocket? That's a handheld supercomputer.

The perfect stocking stuffer (1)

PingPongBoy (303994) | more than 6 years ago | (#21157187)

The way I love my computers, a handheld supercomputer is made for incessant fondling.

A) we got them, B) we will never have them (1)

SmallFurryCreature (593017) | more than 6 years ago | (#21157215)

What is correct? Both.

By all rights today's average computers ARE supercomputers; you just have to measure them against the supercomputers of the past.

By that same token you will NEVER have a handheld supercomputer, because by simply combining a couple of them together you would have an even more powerful computer.

So the article basically states: in the future you will have more processing power than you have today. Mmm, yeah, that might happen.
