AMD, Intel, and NVIDIA Over the Next 10 Years

ScuttleMonkey posted more than 4 years ago | from the what-next dept.

GhostX9 writes "Alan Dang from Tom's Hardware has just written a speculative op-ed on the future of AMD, Intel, and NVIDIA over the next decade. It covers the strengths of AMD's combined GPU and CPU teams, Intel's experience with VLIW architectures, and NVIDIA's software lead in the GPU computing world." What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

YAY! More Prognostication! (5, Insightful)

newdsfornerds (899401) | more than 4 years ago | (#31320134)

I predict wrong predictions.

Re:YAY! More Prognostication! (2, Insightful)

Pojut (1027544) | more than 4 years ago | (#31320518)

I don't even understand why people do this in an official capacity. I mean, I know they have to for shareholders or business planning purposes or whatever, but these sorts of things are almost always wrong.

Are they just doing it for the lulz?

Re:YAY! More Prognostication! (3, Insightful)

lorenlal (164133) | more than 4 years ago | (#31320644)

I think it's because they're being paid to.

Re:YAY! More Prognostication! (1)

newdsfornerds (899401) | more than 4 years ago | (#31321012)

Come to think of it, people have been paid to write their fantasies for centuries.

Re:YAY! More Prognostication! (5, Informative)

kiwirob (588600) | more than 4 years ago | (#31321026)

I predict they are seriously mistaken in forgetting about ARM processors in their analysis. ARM processors have taken over pretty much all of the mobile space and a lot of the netbook space. From Wikipedia: "As of 2007, about 98 percent of the more than one billion mobile phones sold each year use at least one ARM processor." ARM Wikipedia [wikipedia.org] The world is getting more and more mobile, and desktop processing capacity is becoming irrelevant.

I believe Moore's Law, stating that the number of transistors on an integrated circuit will double every two years, and the continual increase in CPU/GPU processing power are a solution looking for a problem. What we need are power-efficient processors that have enough processing capacity to do what we need and nothing more. Unless you are a gamer or doing some serious GPGPU calculations in CUDA or OpenCL, what on earth is the need for a graphics card like the Nvidia GeForce GT 340 with around 380 GFLOPS of floating-point throughput? It's ridiculous.
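For scale, here is a minimal back-of-envelope sketch of what that two-year doubling implies over a decade (the starting transistor count is an illustrative assumption, not a figure from the comment or the article):

```python
# Moore's Law back-of-envelope: transistor count doubling every two years.
# The starting count is an illustrative assumption.
def moores_law(start_transistors, years, doubling_period=2.0):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start_transistors * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2.3e9  # ballpark for a 2010-era high-end GPU die (assumed)
    for years in (2, 4, 6, 8, 10):
        print(f"+{years:2d} years: ~{moores_law(start, years):.2e} transistors")
    # Ten years of two-year doublings is 2**5 = 32x, whatever the starting point.
```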

Re:YAY! More Prognostication! (2, Informative)

Entropy98 (1340659) | more than 4 years ago | (#31321942)

You're right; besides video games, servers, scientific research [stanford.edu], movies, increased productivity, and probably a dozen things I haven't thought of, what do we need more processing power for?

Of course efficiency is good. Computers have been becoming more efficient since day one.

There is a place for tiny, low power ARM chips and 150 watt 8 core server chips.

The GPU will go the way of the coprocessor (0)

Anonymous Coward | more than 4 years ago | (#31320182)

The GPU will go the way of the coprocessor

Re:The GPU will go the way of the coprocessor (1)

ShadowRangerRIT (1301549) | more than 4 years ago | (#31320302)

Do you mean it will become an integral part of the chip? Or that it will be basically unused by standard compilers and only see use in hand-optimized libraries that are used through APIs? Or both?

Re:The GPU will go the way of the coprocessor (5, Interesting)

Lord Ender (156273) | more than 4 years ago | (#31320370)

The GPU will go the way of the coprocessor

On the contrary, I think the CPU will go the way of the coprocessor. The humble Atom may be enough CPU power for most people these days, but you can never have enough GPU power... at least not until your po-- I mean, games, are photorealistic in real time.

Re:The GPU will go the way of the coprocessor (1)

LWATCDR (28044) | more than 4 years ago | (#31321326)

That depends on the market.
If we are talking about the PC/laptop market, once integrated graphics are good enough at 1080p it is game over. In the workstation market I would agree with you.
Even photorealism may not be worth the cost. It also depends on A-class games being optimized for the PC rather than just being console ports.
The way I see the future is this:
nVidia goes for the supercomputer, workstation, and embedded markets. Their biggest product in numbers and revenue will be the Tegra line, if they are lucky.
ATI and Intel duke it out in the x86 market with integrated GPUs, and those will be on the die.
Intel and ATI will see their market share shrink in the mass market because they cannot compete with ARM+GPU on speed vs. cost vs. power use, as smartbooks, tablets, nettops, and smartphones move more and more into the PC space.
I could be totally wrong, and probably am, but it is as good a guess as any.

Re:The GPU will go the way of the coprocessor (2, Funny)

Anonymous Coward | more than 4 years ago | (#31321682)

You just said people will never want photorealistic rendered porn. You have got to be the worst predictor ever.

Have you ever even met a person?

Re:The GPU will go the way of the coprocessor (1)

LWATCDR (28044) | more than 4 years ago | (#31321844)

While I am not a fan of porn, I can assure you that almost none of it is currently "rendered," and frankly, from what little I have seen, the last thing anybody really wants is higher-resolution porn.
In that category of video, even HD may be a step too far.

Re:The GPU will go the way of the coprocessor (0)

Anonymous Coward | more than 4 years ago | (#31322342)

Pervert!

Re:The GPU will go the way of the coprocessor (5, Interesting)

ircmaxell (1117387) | more than 4 years ago | (#31320428)

Well... there are two ways of looking at it: either the GPU and the CPU will be merged into one beast, or there will be further segregation of tasks. In terms of price, which is more efficient: having one chip that can do everything (picture a 128-core CPU with different cores optimized for different tasks, say 32 cores optimized for floating point, 32 for vector processing, and 64 for "generic computing"), or having multiple chips that are each fully optimized for their task?

Actually, now that I think about it, I'd probably say both. Economy computers would be based on the "generic" CPU, whereas performance computers and servers would have add-in modules that let you tailor the hardware toward the task at hand. The motherboard could get an additional 8 sockets (similar to DIMM sockets) that would let you plug in different modules. If you need to do graphics-heavy processing (video games, movie rendering, etc.), you'd add 8 GPU modules to the motherboard. If you needed floating-point capacity, you'd add 8 FPU modules... etc. The advantage of doing it that way over the current PCIe method is that you get to skip the southbridge (so these modules would have full-speed access to system memory, hardware, and each other). Of course, there are a lot of hurdles to implementing such a thing...

I am not an engineer, these are just thoughts that rolled off my head...
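As a toy illustration of the dispatch model described above (the pool sizes mirror the hypothetical 128-core split in the comment; none of this reflects real hardware):

```python
# Toy dispatcher for the hypothetical heterogeneous 128-core chip described
# above: tasks are routed to whichever pool of specialized cores fits them.
from collections import Counter

CORE_POOLS = {"float": 32, "vector": 32, "generic": 64}  # the comment's hypothetical split

def dispatch(tasks):
    """Assign each (name, kind) task to a core pool, falling back to 'generic'."""
    assignments = {}
    load = Counter()
    for name, kind in tasks:
        pool = kind if kind in CORE_POOLS else "generic"
        assignments[name] = pool
        load[pool] += 1
    return assignments, load

if __name__ == "__main__":
    work = [("physics", "float"), ("video-encode", "vector"),
            ("web-server", "generic"), ("crypto", "vector")]
    assignments, load = dispatch(work)
    print(assignments)
    print({pool: f"{load[pool]}/{CORE_POOLS[pool]} busy" for pool in CORE_POOLS})
```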

Re:The GPU will go the way of the coprocessor (0)

Anonymous Coward | more than 4 years ago | (#31320594)

Sounds a lot like the Cell architecture.

Re:The GPU will go the way of the coprocessor (1)

rickb928 (945187) | more than 4 years ago | (#31321540)

As bus speeds go higher and higher, you want components closer together, not further apart.

Change your model to putting specialized components on the die, and you are on the right track. Unless something like optical buses goes mainstream, in which case multiple sockets make some sense: multipurpose sockets that allow you to add vector-specific processing to your system for gaming, vs. I/O or general-purpose processing for servers, vs. leaving them empty for the budget build.

But I think on-die specialization is the near future. Optical seems a long way off, relatively speaking.

Re:The GPU will go the way of the coprocessor (1)

ircmaxell (1117387) | more than 4 years ago | (#31321860)

True, but that ignores the economies of production. Right now, most CPUs are available in three flavors (budget, mid-level, and high-end). So if you look at the permutations of adding these specialized modules (100% CPU; 50% CPU / 50% GPU; 50% CPU / 25% FPU / 25% GPU; 50% CPU / 50% FPU; etc.), you'd have potentially 50 different CPU designs... The design challenge becomes much more difficult (both the core design and the production-run and testing design). If you specialized everything, the number of designs for each part drops drastically (so instead of needing 50 models of CPU, you'd only need 3 or 4, plus 2 or 3 GPU modules, plus 2 or 3 FPU modules, etc.)... The performance (and energy consumption) benefit of pushing everything onto one die would be offset by the reduced difficulty (and hence cost) of producing each part... Again, I'm not saying it's the only way (or even the best), but it's one way that I wouldn't be surprised to see in the not-so-distant future...
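A quick sketch of where a figure like "50 designs" can come from: enumerate the ways of splitting a die between CPU, GPU, and FPU in 25% steps and multiply by market tiers, then compare with a handful of single-purpose modules. The step size and tier count are assumptions for illustration only:

```python
# Counting hypothetical integrated-die variants (CPU/GPU/FPU mixes in 25% steps)
# versus separate single-purpose modules. Step size and tier count are assumed.
from itertools import product

STEP = 25
mixes = [(c, g, f) for c, g, f in product(range(0, 101, STEP), repeat=3)
         if c + g + f == 100]
tiers = 3  # budget / mid-level / high-end bins
print("integrated SKUs:", len(mixes) * tiers)  # 15 mixes x 3 tiers = 45, near the ~50 above
print("modular SKUs:   ", 4 + 3 + 3)           # ~4 CPUs + 3 GPU modules + 3 FPU modules
```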

Re:The GPU will go the way of the coprocessor (1)

rickb928 (945187) | more than 4 years ago | (#31322538)

"If you specialized everything, the number of designs for each part drops drastically (So instead of needing 50 models of CPU, you'd only need 3 or 4. Plus 2 or 3 GPU modules. Plus 2 or 3 FPU modules, etc)... "

Um, right now, you have 50 or so CPU models to accomodate the perceived value of specialization. This impacts choices of CPU speed, caches and their sizes, FSB size, etc.

How this is improved by making 50 or so SKUs of various modules is beyond me. Not to put too fine a point on it, but instead of 50-60 CPU SKUs, you think 50-60 SKUs of various components is a better idea?

Now, the idea of choosing and changing the mix, to add more vector modules and then add more FP modules for other purposes sounds coo,, but honestly, how often do you bother to upgrade your CPU now? Don't you take advantage of system board improvements often?

And we still haven't dealt with the problem of distance. When the FSB starts going over GHz, you will have lots of problems with interconnects. Optical is the future, but it doesn't seem to be in sight yet. So I think this modular concept will play out on-die, not as multisocket options.

If at all. Intel is banging out higher- and higher-speed chips, AMD is getting better fab ability daily, and ARM chips are knocking on the door and want in on the action. This is a great time to be building systems. Let's see, faster or cheaper? Both? Remember the 486 days?

Re:The GPU will go the way of the coprocessor (1)

ianare (1132971) | more than 4 years ago | (#31322516)

This is exactly the sort of progress that will NEVER happen unless Windows is no longer the dominant OS, and, more broadly, unless proprietary programs in general lose their grip. With OSS, this sort of thing is just a recompile away. Maybe not for full optimization, but at least to get up and running. For example, going from 32 to 64 bit, or from x86 to ARM, is generally possible with very few (if any) changes. But this is impossible if the source is unavailable.

Mac isn't as bad as Windows for this, since Apple writes the OS and many of the core apps, and has shown itself willing to provide a compatibility layer for older software. But with Windows? It'll never happen.

Re:The GPU will go the way of the coprocessor (2, Insightful)

Grishnakh (216268) | more than 4 years ago | (#31320818)

I disagree. Floating-point coprocessors basically just added some FP instructions to a regular single-threaded CPU. There was no parallelism; they just removed the need to do slow floating-point calculations using integer math.

However, GPUs, while they mainly do floating-point calculations, are essentially vector processors, and do calculations in parallel. They can easily benefit from increased size and parallelism: the more parallel processing capability a GPU has, the more realistic it can make graphical applications (i.e. games). And with all the GPGPU applications coming about (where you use GPUs to perform general-purpose (i.e., not graphics) calculations), there's no end to the amount of parallel computational power that can be used. The only limits are cost and energy.

So if someone tried to fold the GPU into the processor, just how much capability would they put there? And what if it's not enough? Intel has already tried to do this, and it hasn't killed the GPU at all. Not everyone plays bleeding-edge 3D games; a lot of people just want a low-powered computer for surfing the web, and maybe looking at Google Earth. An Intel CPU with a built-in low-power GPU works fine for that, but it won't be very useful for playing Crysis unless you think 5 fps is good. People who want to play photo-realistic games, however, are going to want more power than that. And oil exploration companies and protein-folding researchers are going to want even more.

GPUs aren't going anywhere, any time soon. Lots of systems already have eliminated them in favor of integrated solutions, but these aren't systems you're going to play the latest games on. For those markets, NVIDIA is still doing just fine.
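To make the "vector processor" point concrete, here is a minimal sketch of the data-parallel style that GPUs and GPGPU APIs favor: the same operation applied independently to every element of a large array. NumPy on a CPU is used purely as a stand-in for the programming model:

```python
# Data-parallel style: one operation applied independently to every element,
# which is the shape of work a GPU's many small cores soak up.
# NumPy on the CPU is only a stand-in to show the model.
import numpy as np

positions = np.random.rand(100_000, 3).astype(np.float32)   # 3D points
velocities = np.random.rand(100_000, 3).astype(np.float32)
dt = np.float32(1.0 / 60.0)

def step_scalar(pos, vel, dt):
    """CPU-style formulation: an explicit loop over elements."""
    out = pos.copy()
    for i in range(len(pos)):
        out[i] += vel[i] * dt
    return out

def step_vector(pos, vel, dt):
    """GPU-style formulation: the whole array at once, no loop-carried dependence."""
    return pos + vel * dt

assert np.allclose(step_scalar(positions[:1000], velocities[:1000], dt),
                   step_vector(positions[:1000], velocities[:1000], dt))
```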

Re:The GPU will go the way of the coprocessor (1)

toastar (573882) | more than 4 years ago | (#31321804)

I disagree. Floating-point coprocessors basically just added some FP instructions to a regular single-threaded CPU. There was no parallelism; they just removed the need to do slow floating-point calculations using integer math.

However, GPUs, while they mainly do floating-point calculations, are essentially vector processors, and do calculations in parallel. They can easily benefit from increased size and parallelism: the more parallel processing capability a GPU has, the more realistic it can make graphical applications (i.e. games). And with all the GPGPU applications coming about (where you use GPUs to perform general-purpose (i.e., not graphics) calculations), there's no end to the amount of parallel computational power that can be used. The only limits are cost and energy.

So if someone tried to fold the GPU into the processor, just how much capability would they put there? And what if it's not enough? Intel has already tried to do this, and it hasn't killed the GPU at all. Not everyone plays bleeding-edge 3D games; a lot of people just want a low-powered computer for surfing the web, and maybe looking at Google Earth. An Intel CPU with a built-in low-power GPU works fine for that, but it won't be very useful for playing Crysis unless you think 5 fps is good. People who want to play photo-realistic games, however, are going to want more power than that. And oil exploration companies and protein-folding researchers are going to want even more.

GPUs aren't going anywhere, any time soon. Lots of systems already have eliminated them in favor of integrated solutions, but these aren't systems you're going to play the latest games on. For those markets, NVIDIA is still doing just fine.

Why can't they dedicate half the die space to x86 cores and the other half to a big SIMD processor?

Re:The GPU will go the way of the coprocessor (1)

Grishnakh (216268) | more than 4 years ago | (#31322294)

Because die space is expensive. Why would someone with a low-powered notebook or netbook want to spend extra money on 3D graphics capabilities they don't need? And why should someone who's going to buy an NVIDIA card waste money on a CPU that's twice as large as it needs to be, because it has a built-in GPU they'll never use, when they could have double the number of CPU cores instead?

Re:The GPU will go the way of the coprocessor (1)

toastar (573882) | more than 4 years ago | (#31321770)

The GPU will go the way of the coprocessor

The GPU is a coprocessor

The Singularity? (-1, Offtopic)

Singularity42 (1658297) | more than 4 years ago | (#31320188)

With greater personal power, we won't have Microsoft dictating what 3D features we can have. With individuals becoming supercomputers, these three companies will be out of business. However, personal survivability and power will be sufficient that former employees will be fine.

Re:The Singularity? (4, Insightful)

MobileTatsu-NJG (946591) | more than 4 years ago | (#31320210)

With greater personal power, we won't have Microsoft dictating what 3D features we can have. With individuals becoming supercomputers, these three companies will be out of business. However, personal survivability and power will be sufficient that former employees will be fine.

What?

Re:The Singularity? (5, Funny)

binarylarry (1338699) | more than 4 years ago | (#31320266)

In short: make your time.

Re:The Singularity? (1)

Yvan256 (722131) | more than 4 years ago | (#31320438)

There's a new GPU company called Zig? Where?!

Re:The Singularity? (0)

Anonymous Coward | more than 4 years ago | (#31320636)

move zig move!

Re:The Singularity? (0)

Anonymous Coward | more than 4 years ago | (#31320660)

move zig move!

Take off, eh.

Re:The Singularity? (0)

Anonymous Coward | more than 4 years ago | (#31321114)

There's a new GPU company called Zig? Where?!

That's a good question. Main screen turn on.

Re:The Singularity? (1)

newdsfornerds (899401) | more than 4 years ago | (#31321194)

Move zig!

Re:The Singularity? (1)

Tibor the Hun (143056) | more than 4 years ago | (#31322628)

Only if it's cubed.

Re:The Singularity? (1)

binarylarry (1338699) | more than 4 years ago | (#31322914)

Now you just sound educated stupid.

x86 and DirectX (1)

Singularity42 (1658297) | more than 4 years ago | (#31320310)

Let's look at x86. It's dominant mostly because of inertia. Millions of chips must be taken into account when deciding whether to try a new architecture. A personal supercomputer, a self-modifying brain, will not have limitations on trying new things.

Or look at DirectX: this is a cooperative library for ensuring games work with Windows. Personal supercomputers will be able to establish higher-level descriptions when interacting with other supercomputers (posthumans). Without burned-in chips or IP (intellectual property will seem silly in this era), it's about the stories you can tell.

Re:x86 and DirectX (1)

MobileTatsu-NJG (946591) | more than 4 years ago | (#31320558)

Okay, so abstraction on hardware with practically unlimited processing power will make NVidia obsolete. There's precedent for that, even if it's a bit off topic. The Wii and the DS.

What you're describing, though, isn't even really on the horizon. In the more near-term, like at least for the next decade, we're still reliant on predictable hardware specs to make games and apps. You talk about inertia keeping the x86 alive. Well of course! This industry thrives on standards. That's how Microsoft can 'dictate 3D features we have'. I'd normally stand right next to you and light my torch, but the 3d apps I rely on to put the roof over my head still use an old version of OpenGL and only give me 8 lights to play with in real-time. I'm not sure DirectX is exactly what I want there, but I do wish a bigger entity would make these guys pull these apps into the 21st century.

Re:x86 and DirectX (1)

MobileTatsu-NJG (946591) | more than 4 years ago | (#31320618)

Sorry to reply to my own post, but I was a bit hasty in my rant. I'm thinking about a couple of apps in particular and not about all the apps. Blender's viewport kicks ass in ways that make me envious and 3D Studio MAX isn't far behind. I just wanted to acknowledge this before I get my well-deserved roast. :D

Re:x86 and DirectX (0)

Anonymous Coward | more than 4 years ago | (#31321754)

Or look at DirectX: this is a cooperative library for ensuring games work with Windows. Personal supercomputers will be able to establish higher-level descriptions when interacting with other supercomputers (posthumans). Without burned-in chips or IP (intellectual property will seem silly in this era), it's about the stories you can tell.

What?

Re:The Singularity? (1)

VGPowerlord (621254) | more than 4 years ago | (#31320688)

With greater personal power, we won't have Microsoft dictating what 3D features we can have.

Actually, usually it's nVidia or ATI dictating what 3D features we have, with the other immediately implementing the same thing to keep up.

Re:The Singularity? (1)

maxume (22995) | more than 4 years ago | (#31320770)

When singularity come, electronics end up all over.

ARM (3, Insightful)

buruonbrails (1247370) | more than 4 years ago | (#31320224)

All three will be marginalized by the ARM onslaught. Within 10 years, the smartphone will be the personal computing device, and AMD and Intel processors will power the cloud.

Re:ARM (2, Insightful)

ShadowRangerRIT (1301549) | more than 4 years ago | (#31320350)

I really, really hope you're wrong. Forced to choose between a smartphone and nothing at all, I'd likely go with nothing. Which would be professionally problematic, since I code for a living.

Re:ARM (3, Insightful)

h4rr4r (612664) | more than 4 years ago | (#31320560)

So you could get an ARM laptop or an x86 workstation. For work use, thin clients will be popular again soon, and many people will use a smartphone, hooked to their TV for display when at home, instead of a home computer.

Then the cycle will restart. Welcome to the wheel of computing.

Re:ARM (4, Interesting)

Grishnakh (216268) | more than 4 years ago | (#31320988)

And why would they bother with that, when they can simply have a separate computer at home instead of having to worry about dropping theirs and losing everything?

PCs aren't going anywhere, and the idea that they'll be replaced by smartphones is utterly ridiculous. Despite the giant increases in computing abilities, and the ability to fit so much processing power into the palm of your hand, even mainframe computers are still with us; their capabilities have simply increased just like everything else. Why limit yourself to the processing ability that can fit into your hand, if you can have a desktop-size computer instead, in which you can fit far more CPU power and storage? Today's smartphones can do far more than the PCs of the 80s, but we still have PCs; they just do a lot more than they used to.

Of course, someone will probably reply saying we won't need all the capability that a PC-sized system in 20 years will have. That sounds just like the guy in the 60s who said no one would want to have a computer in their home. A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.

Re:ARM (1)

anexkahn (935249) | more than 4 years ago | (#31321480)

I think it will be a mix of the two. For some, a current netbook does everything they need. If a phone came to have more horsepower than current netbooks, we would see some users using only the one device. However, there will always be users who need as much processing power as they can get.

Re:ARM (1)

Grishnakh (216268) | more than 4 years ago | (#31321740)

For some a current netbook does everything they need.

For some, but not very many. Most people I imagine have them in addition to their other computing devices, and mainly use the netbooks when they're away from the home or office because it's small and easily portable. The problem with the netbook is that the screen and keyboard are much too small for anything besides web surfing (and even that's not that great because of the tiny screen, but at least it's better than a smartphone).

Sure, you could get a "docking station" and plug in a larger monitor and keyboard, but docking stations for laptops have never done that well either, and for the price, you might as well just get a cheap desktop computer, as you'll certainly spend as much. In the future, we can probably (if MS doesn't hold us back too much) look forward to much better synchronization between different computing devices, so it would seem pointless to try to use the same CPU for many different applications, when you can just have separate computers. Plus, what kind of fool would want to risk losing all their important data when they accidentally drop or sit on their netbook? Obviously, you want to keep multiple computers around, so that you can still get work done and access the web when one of them goes down. And, with $/MIPS prices always falling, it's senseless to not have multiple CPUs in multiple devices. It'd cost more to buy a special docking station than to just have a second computer.

Re:ARM (1)

Sir_Sri (199544) | more than 4 years ago | (#31321528)

I dunno, I can imagine a 3D gaming tapestry taking up half my wall, and I can imagine redundant computing boxes in the house for my kids (whom I haven't had yet, and must presume will live at home in 2030) and their half-wall gaming/TV tapestries. And my girlfriend will probably want something to browse her recipes and listen to music on. Not to stereotype, mind you; her computer can do a lot more than that now, she just seems to only care about recipes and listening to music.

A computer that costs a billion dollars today will presumably cost in the small number of thousands 20 years from now. If I had a billion dollars' worth of computing today, I can think of a whole lot of really cool, albeit frivolous, things I would do.

Re:ARM (1)

LUH 3418 (1429407) | more than 4 years ago | (#31321890)

I agree, but I think an even more important factor is the interface. Sure, your cellphone could have enough computing power to run most of the applications you use on a regular basis, but... how fast can you type on it? How comfortable is it to do that for extended periods of time? What about all the students doing assignments, all the people writing reports and making spreadsheets? How comfortable do you feel working with such a tiny screen? The fact that cellphones have to be small to be portable limits their usability in important ways.

A lot of these limits could be fixed, say, by having a small docking port (perhaps even wireless docking) with external I/O devices, monitors, or TVs. But then, if you're going to have a device to hook your cellphone to at home so you can use it as a desktop, how much more expensive would it be to add a CPU, some RAM, and a hard drive to that thing, in order to make it... a full-fledged desktop computer?

Re:ARM (1)

kimvette (919543) | more than 4 years ago | (#31322570)

A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.

Of course we will; Microsoft will still be producing operating systems around Windows architecture, so we will need the usual CPU-hogging Windows+antispyware+antivirus stack, and IE6 will still be in heavy use in the enterprise. ;)

Re:ARM (0)

Anonymous Coward | more than 4 years ago | (#31322058)

"With every mistake we must surely be learning.."

Is there a way to get off this "wheel" you speak of?

Re:ARM (1)

Jeppe Salvesen (101622) | more than 4 years ago | (#31320586)

I suspect we might be connecting our smart-phones to a monitor (or some other display technology), and using a wireless keyboard. When programming, or doing other really serious stuff.

Re:ARM (3, Interesting)

Angst Badger (8636) | more than 4 years ago | (#31320788)

All three will be marginalized by the ARM onslaught. Within 10 years, smartphone will be the personal computing device, AMD and Intel processors will power the cloud.

ARM may well come to dominate personal computing, but it sure won't be via the smartphone. No one is going to edit long word processor documents on their phone, much less edit spreadsheets, write code, or do much else that qualifies as actual work. And it's not because they don't already -- in many cases -- have enough processor power; it's because they don't have full-sized keyboards and monitors. I'll grant that it's possible that phones or PDAs of the future might well attach to full-featured I/O devices, but by themselves, no.

The cloud, too, has some significant limits that it will be difficult if not actually impossible to overcome. Security is a major issue, arguably theoretically resolvable, but trusting your critical data to an outside company to whom you are, at best, a large customer is not.

Re:ARM (1)

h4rr4r (612664) | more than 4 years ago | (#31320906)

Wireless connectivity to an HDTV for display, and Bluetooth for keyboard and mouse.

Re:ARM (1)

Grishnakh (216268) | more than 4 years ago | (#31320872)

Exactly. And programmers will all be writing their code on a smartphone with a touchscreen, and no one will have a monitor larger than 4" diagonal. 10 years from now, desktop computers will be gone.

Re:ARM (1)

dunkelfalke (91624) | more than 4 years ago | (#31321002)

You mean like all those Java network PCs you are seeing everywhere now?

What do you call a smartphone (1)

ElusiveJoe (1716808) | more than 4 years ago | (#31322736)

...which has a full-sized keyboard, a mouse and a huge display? I bet you call it 'a computer' or 'PC'. Because editing documents, watching videos, even web surfing on a 5 inch display with a micro keyboard sucks.

Re:ARM (1)

lotho brandybuck (720697) | more than 4 years ago | (#31322800)

Isn't NVIDIA making an ARM chip now (Tegra)? Is anyone using it yet?

I have to take a walk, get some fresh air and pizza. I almost had a seizure reading TFA. I probably would've if I'd had a better GPU!

Bitboys (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31320284)

Bitboys!

Haven't you heard? Studies have shown.. (0, Redundant)

introspekt.i (1233118) | more than 4 years ago | (#31320292)

The use of futurism has been thoroughly discredited.

Re:Haven't you heard? Studies have shown.. (4, Funny)

Anonymous Coward | more than 4 years ago | (#31320324)

The use of futurism has been thoroughly discredited.

I predicted that years ago.

Re:Haven't you heard? Studies have shown.. (1)

Pojut (1027544) | more than 4 years ago | (#31320570)

I find it funny when Science Channel has one of those "Future" shows on, and you get some asshole talking into the camera with the little caption under his name pegging him as a professional "futurist".

He gets paid to make wild guesses. ::golf clap::

Re:Haven't you heard? Studies have shown.. (2, Insightful)

maxume (22995) | more than 4 years ago | (#31320712)

If he is getting paid well, he doesn't give two shits what kind of clap you are doing.

Re:Haven't you heard? Studies have shown.. (1)

Pojut (1027544) | more than 4 years ago | (#31320972)

Lol, good point. Reminds me of a conversation that took place at our dinner table last night:

Wife's Uncle: "I wonder how Michael Phelps can swim for a living. Doesn't that get boring?"
Me: "Sure...if you consider it boring to be a millionaire."

VLIW (1)

Citizen of Earth (569446) | more than 4 years ago | (#31320304)

It seems like VLIW is a bit like nuclear fusion -- in ten years, people will still be talking about how its practical realization is ten years away.

Re:VLIW (1)

AvitarX (172628) | more than 4 years ago | (#31320432)

Hasn't ARM been doing this for ages?

I swear I read a review of the NetWinder appliance, and the review mentioned it as a cheap way to start hacking on a VLIW architecture.

This was many years ago, and it was essentially an office file-server appliance.

Re:VLIW (1)

Svartalf (2997) | more than 4 years ago | (#31321348)

ARM's not VLIW.

Re:VLIW (1)

oldhack (1037484) | more than 4 years ago | (#31320542)

Let's see... do I remember reading about this back in the 80s as the next generation CPU architecture? In Byte magazine, I think it was...

Too much hyperbole... (5, Insightful)

Foredecker (161844) | more than 4 years ago | (#31320548)

You can always spot a sensationalist post when part of it predicts or asks who will go out of business. Or what thing will disappear.

For example, in his post, ScuttleMonkey [slashdot.org] asks this:

...Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

Note, the post is a good one; I'm not being critical. But change in the tech industry rarely results in big companies going out of business, and when it does, it takes a long time. I think Sun is the canonical example here. It took a long time for them to die, even after many, many missteps. Sun faded away not because of competition or some game-changing technology, but simply because they made bad (or some would say awful) decisions. Same for Transmeta.

People have been predicting the death of this or that forever. As you might imagine, my favorite one is predicting Microsoft's death. That's been going on for a long, long time. The last I checked, we are still quite healthy.

Personally, I don't see Intel, AMD, or NVIDIA dying any time soon. Note, AMD came close this last year, but they have had several near-death experiences over the years. (I worked there for several years...)

Intel, AMD, and NVIDIA's fundamental business is turning sand into money. That was a famous line from Jerry Sanders, the founder of AMD. I'm paraphrasing, but it was long the idea at AMD that it didn't matter what came out of the fabs as long as the fabs were busy. Even though AMD and NVIDIA no longer own fabs, this is still their business model (more or less).

I think it's interesting how a couple of posters have talked about ARM. Remember, AMD and NVIDIA can jump on the ARM bandwagon at any time, and Intel is already an ARM licensee. Like AMD, they are in the business of turning sand into money; they can and will change their manufacturing mix to maintain profitability.

I don't see the GPU going away either. GPUs are freakishly good at what they do. By good, I mean better than anything else. Intel flubbed it badly with Larrabee. A general-purpose core simply isn't going to do what very carefully designed silicon can do. This has been proven time and time again.

Domain-specific silicon will always be cheaper, better performing, and more power efficient in most areas than a general-purpose gizmo. Note, this doesn't mean I dislike general-purpose gizmos (like processors); I believe that the best system designs have a mix of both, suited to the purpose at hand.

-Foredecker

Re:Too much hyperbole... (2, Funny)

Anonymous Coward | more than 4 years ago | (#31320796)

we [Microsoft] are still quite healthy

Oh noes! Who let him post on slashdot?!

Re:Too much hyperbole... (1)

Twinbee (767046) | more than 4 years ago | (#31320864)

Although Larrabee may not be as cost/energy/speed efficient as a dedicated GPU, perhaps it will be almost as fast but easier to program for, thus ensuring its popularity...

Re:Too much hyperbole... (1)

Foredecker (161844) | more than 4 years ago | (#31321366)

The initial Larrabee product was canceled. [anandtech.com] Intel had to retrench on their graphics plans. Again. They are a smart company, but they have struggled to get off the ground in the graphics area. I'm assuming they will someday be successful, but today isn't that day.

-Foredecker

Re:Too much hyperbole... (1)

mako1138 (837520) | more than 4 years ago | (#31322222)

NVIDIA's already got an ARM platform: Tegra.

Parallel Computing: Both CPU and GPU Are Doomed (0, Flamebait)

rebelscience (1717928) | more than 4 years ago | (#31320552)

Both CPU and GPU Are Doomed [blogspot.com]

Unless those big dogs wake up soon from their stupor, an unknown startup will sneak up behind them and steal their pot of gold.

Re:Parallel Computing: Both CPU and GPU Are Doomed (1)

Foredecker (161844) | more than 4 years ago | (#31320578)

That is what Transmeta thought. Intel proved more agile than they predicted. AMD, Intel, and NVIDIA can move faster than people think. I suggest that it is their market to lose, not others' to win. -Foredecker

Re:Parallel Computing: Both CPU and GPU Are Doomed (1)

nxtw (866177) | more than 4 years ago | (#31321020)

AMD, Intel and NVIDIA can move faster than people think

I disagree. It seems CPUs and GPUs are designed and planned well ahead of time. Tapeout [wikipedia.org] occurs many months before products hit the market. Intel's Sandy Bridge apparently taped out in June 2009 [fudzilla.com] and won't be released until 2011. Yonah taped out in October 2004 [theinquirer.net] but wasn't released until January 2006. If it appears that these companies are responding quickly with new, competitive designs, it's because they correctly predicted the market direction and planned accordingly.

The companies can only really move fast in adjusting pricing, marketing, availability, and SKUs.
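For the two chips the parent cites, the tapeout-to-launch gap works out roughly as follows (tapeout dates from the linked reports; the launch months are approximate assumptions):

```python
# Rough tapeout-to-launch gaps for the two chips mentioned above.
# Tapeout dates come from the linked reports; launch months are approximate.
from datetime import date

def months_between(a, b):
    return (b.year - a.year) * 12 + (b.month - a.month)

gaps = {
    "Yonah":        (date(2004, 10, 1), date(2006, 1, 1)),
    "Sandy Bridge": (date(2009, 6, 1),  date(2011, 1, 1)),  # assuming an early-2011 launch
}
for chip, (tapeout, launch) in gaps.items():
    print(f"{chip}: ~{months_between(tapeout, launch)} months from tapeout to launch")
```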

Re:Parallel Computing: Both CPU and GPU Are Doomed (2, Insightful)

Foredecker (161844) | more than 4 years ago | (#31321586)

Yes, it takes about two years (or more) to go from a whiteboard to first silicon. Until I worked at Microsoft, I worked at hardware and silicon companies. But remember, the competition to Intel, AMD, and NVIDIA will be other silicon companies, not software companies. The new competition will have the same constraints. This is also a small industry; it's very difficult to do something both major and new in secret. When I was at AMD, we knew about Transmeta's plans when they were still in stealth mode. It wasn't because of anything nefarious; the community is small and leaky. -Foredecker

Both CPU and GPU Are Doomed (1)

rebelscience (1717928) | more than 4 years ago | (#31321052)

Both CPU and GPU Are Doomed [blogspot.com]

Why was I downvoted as flamebait? Who did I flamebait? Are some moderators on Slashdot working for Intel, AMD or Nvidia?

Big Dog Censorship (0)

Anonymous Coward | more than 4 years ago | (#31321956)

That's why your comment was downvoted as flamebait. They don't like your comment, so they shut you out. Censorship from the big dogs, that's all.

This is GPU-only, the less interesting question. (1)

Tumbleweed (3706) | more than 4 years ago | (#31320664)

The article is about the GPU business, which, IMO, is the less interesting aspect of these companies. The new hotness is all in the mobile space. If you want interesting, read about the upcoming Cortex-A9-based battle going on for 'superphones', smartbooks, and tablets.

People have been predicting the death of the desktop computer for a long time, but the problem is that there hasn't been anything realistic to replace it. We're within sight of that replacement. Once everyone has a superphone that can do the bulk of what people normally do (either in mobile mode, or when docked to a nice monitor/keyboard/mouse at home/work), THEN (and only then) will the desktop PC fade away for the majority of people's home use.

I give it 2 years before this is practical, though of course the ecosystem to support such a change is the hard part. Can you get access to all your data, etc., from your docked mobile? That's gonna be the key. What about when you want to take a call at home while using your docked mobile as a computer?

You want to make money? Solve these problems now.

Re:This is GPU-only, the less interesting question (1)

rezalas (1227518) | more than 4 years ago | (#31321010)

Mobile phones don't have the shelf life required to replace the desktop. I like having my PC at home simply because it cannot easily break like a phone can. I don't worry about a careless worker hitting my phone holster with a door and shattering the touchscreen on my home PC, and I don't worry about someone crushing it when they carelessly sit on the table my PC rests on. Everyone I know with a smartphone replaces it every couple of years, but I have desktops that last 5 years before I move them to a test-bench position or give them to a nephew as an upgrade (and even then, I only replace them for gaming purposes and not due to any hardware failures).

Re:This is GPU-only, the less interesting question (1)

Tumbleweed (3706) | more than 4 years ago | (#31321104)

Mobile phones don't have the shelf life required to replace the desktop.

This would be part of that ecosystem mention I made.

If you dock your mobile phone to access the 'big' data (videos, music, etc.), and other stuff is in the cloud, then replacing the phone itself isn't going to be a big deal. And with cell providers doing the whole 'stay with us for 2 years and get your phone for half price' thing, people will be incentivized to upgrade their shit more often, which, from a web developer's perspective, I find quite nice. :)

Re:This is GPU-only, the less interesting question (1)

rezalas (1227518) | more than 4 years ago | (#31321272)

True, but cloud computing has a serious obstacle to overcome in this area: the ISP. Every ISP is working on a way to charge per gig per month right now (I know, I work for one that has actually been doing it in the central US for over 3 years). ISPs are one of the greatest threats to tech like this, simply because they want to find a way to make more money off providing the same service, which is slowly choking customers' desire to use the internet.

Whoever can perfect low-power computing (1)

peter303 (12292) | more than 4 years ago | (#31320710)

Batteries limit mobile devices. Heat limits supercomputer servers.
The first low-power CPUs like the Atom were lame. Better devices are on the way.

Re:whomever can perfect low-power computing (1)

Nadaka (224565) | more than 4 years ago | (#31321094)

Since when was the Atom the first low power device? ARM was earlier, uses less power and has much better performance/watt.

Predictions (1)

Kjella (173770) | more than 4 years ago | (#31320824)

Consoles come out with 1080p/DX11-class graphics. Graphics cards for the PC try to offer 2560x1600+ and whatnot, but the returns are extremely diminishing, and many current PC games will come to a console with keyboard and mouse. GPGPU remains a small niche for supercomputers and won't carry the cost without mass-market gaming cards.

The volume is increasingly laptops with CPU+GPU in one package, and there'll be Intel CPUs with Intel GPUs using an Intel chipset on Intel motherboards; they'll be a serious player in the SSD market too. AMD will do the same but suffer from being behind on manufacturing process, and will continue to struggle but survive, like Macs do in a Windows market.

nVidia will lose their way if they haven't already lost it; everything they've said so far about Fermi makes me think they're heading down a dead-end street. No i7/i5 motherboards with nVidia chipsets, new Atoms which kill ION: nVidia is being forced to go discrete in a market that is increasingly integrated. The good news for them is that Intel will continue to flop on 3D performance, including Larrabee, so there'll be a market for discrete cards a little longer.

Article--good; adverts, hilarious (1)

Skratchez (1304839) | more than 4 years ago | (#31320830)

The (I assume automatically targeted) ads were awful. They had one for an add-on sound card on the second page of the article, "The Death of the Sound Card". Lovely pictures and remembrances of old stuff though; I still remember my first 3dfx Voodoo, and sending the daughter-card back for a Monster when I found out the frame rate in GLQuake was weak sauce. Good times.

Improved power consumption, multi stack machines (1)

jdhenshaw (660983) | more than 4 years ago | (#31320946)

(1) Low power consumption that avoids the use of a traditional clock. (2) Stack machine architecture - produces dense code. (3) Multiple stacks with shared hardware and compiler support.
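A minimal illustration of point (2): in a stack machine, operands live on the stack, so most instructions carry no operand fields, which is where the code density comes from. The instruction set below is an invented toy, not any real architecture:

```python
# Toy stack-machine interpreter: operands live on the stack, so most
# instructions need no operand fields at all, hence dense code.
# The instruction set is invented for illustration.
def run(program):
    stack = []
    for op, *arg in program:
        if op == "push":
            stack.append(arg[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "dup":
            stack.append(stack[-1])
        else:
            raise ValueError(f"unknown op {op!r}")
    return stack

# (3 + 4) * (3 + 4): only the two pushes carry immediate operands.
print(run([("push", 3), ("push", 4), ("add",), ("dup",), ("mul",)]))   # [49]
```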

At some point, the GPU goes on the CPU chip (4, Informative)

Animats (122034) | more than 4 years ago | (#31321044)

At some point, the GPU goes on the CPU chip, and gets faster as a result.

Maybe.

GPUs need enormous bandwidth to memory, and can usefully use several different types of memory with separate data paths. The frame buffer, texture memory, geometry memory, and program memory are all being accessed by different parts of the GPU. Making all that traffic go through the CPU's path to memory, which is already the bottleneck with current CPUs, doesn't help performance.

A single chip solution improves CPU to GPU bandwidth, but that's not usually the worst bottleneck.

What actually determines the solution turns out to be issues like how many pins you can effectively have on a chip.
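A back-of-envelope version of the bandwidth argument (every number below is an illustrative assumption, not a figure from the comment or the article):

```python
# Rough pixel + texture traffic versus a typical CPU memory bus.
# Every number below is an illustrative assumption.
width, height   = 1920, 1200   # display resolution
bytes_per_pixel = 4            # 32-bit color
fps             = 60
overdraw        = 4            # each pixel shaded/written ~4x per frame (assumed)
texture_bytes   = 16           # texture/geometry bytes fetched per pixel shaded (assumed)

pixels_per_sec = width * height * fps * overdraw
gpu_gbs = pixels_per_sec * (bytes_per_pixel + texture_bytes) / 1e9
cpu_bus_gbs = 17.0             # dual-channel DDR3-1066, roughly (assumed)

print(f"pixel + texture traffic: ~{gpu_gbs:.0f} GB/s")   # ~11 GB/s even in this modest case
print(f"typical CPU memory bus:  ~{cpu_bus_gbs:.0f} GB/s")
# Even a modest workload would eat most of a shared CPU memory path, while
# high-end cards pair the GPU with dedicated GDDR delivering well over 100 GB/s.
```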

Re:At some point, the GPU goes on the CPU chip (0)

Anonymous Coward | more than 4 years ago | (#31322122)

More importantly, GPUs do not need low-latency memory, since they are extremely parallel, with far more work always ready to be processed than processors available.
For CPUs, memory latency is important and caches are critical; for GPUs, memory bandwidth is absolutely critical, and everything else is mostly there to make them work nicely for compute applications and other corner cases.

Still waiting for OpenCL (2, Insightful)

janwedekind (778872) | more than 4 years ago | (#31321046)

I am still waiting for OpenCL to gain traction. All this CUDA and Stream SDK stuff is tied to a particular company's hardware. I think there is a need for a free software implementation of OpenCL with different backends (NVidia GPU, AMD GPU, x86 CPU). Software developers will have great difficulty supporting GPUs as long as there is no hardware-independent standard.
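As a concrete example of the hardware-independent angle, OpenCL host code can enumerate whatever platforms and devices are installed and pick a backend at runtime. A minimal sketch using the PyOpenCL bindings (assumes pyopencl and at least one OpenCL driver are present):

```python
# Enumerate whatever OpenCL backends are installed (NVIDIA, AMD, CPU, ...)
# and pick one at runtime: the hardware independence the comment is after.
# Requires the pyopencl bindings and at least one OpenCL driver.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  device: {device.name}  [{cl.device_type.to_string(device.type)}]")

# Grab any available device and build a trivial kernel against it.
ctx = cl.create_some_context(interactive=False)
prog = cl.Program(ctx, """
__kernel void scale(__global float *x, const float a) {
    int i = get_global_id(0);
    x[i] *= a;
}
""").build()
```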

Think of the patents! (1)

dthirteen (307585) | more than 4 years ago | (#31321314)

A patent dies every time you mentally masturbate on Slashdot. Oh, the horror.

newcomer (1)

HyperQuantum (1032422) | more than 4 years ago | (#31321342)

will we have a newcomer that usurps the throne and puts everyone else out of business?

Unlikely.

After all, any newcomer would have to buy licenses for all those patents owned by the existing companies. That's a lot of money, and it isn't even money spent on realizing the newcomer's innovative, great idea; that requires another big pile of cash. Good luck finding enough investors for your business.

ATI lacks support for most recent Xorg (0)

Anonymous Coward | more than 4 years ago | (#31321420)

ATI will need to step up its Linux driver development. It appears that certain flavours of Linux have been scrubbed from the list they are willing to support (namely Fedora).

http://ati.cchtml.com/show_bug.cgi?id=1696

Nvidia, on the other hand, is doing well at providing 3D drivers across the board. Users have the option of using either the proprietary drivers from Nvidia or the new open-source Nouveau drivers in the kernel.

I predict (0)

ClosedSource (238333) | more than 4 years ago | (#31321788)

That any processor company that is able to find a technology to significantly boost the performance of a single-core processor will wipe the floor with competitors who are still relying on the multi-core performance kludge.

ridiculous quotation (0)

Anonymous Coward | more than 4 years ago | (#31321892)

From the nvidia page:

After all, we’re not seeing any proprietary 3D graphics APIs in practice anymore.

Yay, finally OpenGL has put the last nail in the coffin of Directwhateverit'scalled.

what's interesting to me... (4, Interesting)

buddyglass (925859) | more than 4 years ago | (#31322022)

What I find interesting is the overall lack of game-changing progress when it comes to tasks that aren't 3D or HD video related. In March 2000, i.e. ten years ago, the top-of-the-line CPU would have been a Pentium III Coppermine, topping out around 1 GHz. I could put Windows XP on one of those (with enough RAM) and do most office/browsing tasks about as fast as I could with today's top-of-the-line CPU. Heck, it would probably handle Win7 okay. Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would be looking at a 25 MHz 486DX.
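The comparison in rough numbers (the 2010 clock figure is an assumed ballpark, and clock speed is of course only one axis of progress):

```python
# Rough clock-speed ratios for the two decades compared above.
# The 2010 figure is an assumed ballpark; clocks are only one axis of progress.
clocks_mhz = {1990: 25, 2000: 1000, 2010: 3300}

print(f"1990 -> 2000: {clocks_mhz[2000] / clocks_mhz[1990]:.0f}x in clock speed")
print(f"2000 -> 2010: {clocks_mhz[2010] / clocks_mhz[2000]:.1f}x in clock speed")
# 40x versus ~3x: the gains since 2000 have come from cores, caches, and GPUs
# rather than raw frequency.
```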

Re:what's interesting to me... (2, Interesting)

yuhong (1378501) | more than 4 years ago | (#31322648)

In 1990 you would be looking at a 25 MHz 486DX.

Which is the minimum for most x86 OSes nowadays. In fact, some newer x86 OSes and software have even higher requirements. Windows XP, and SQL Server 7.0 and later, for example, require the CMPXCHG8B instruction, and Flash 8 and later require MMX.
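On Linux you can check whether a given CPU exposes those two features from the flags the kernel reports; a quick sketch (the flag names cx8 for CMPXCHG8B and mmx are the ones /proc/cpuinfo uses, so this only works where that file exists):

```python
# Check for the two features mentioned above (CMPXCHG8B -> "cx8", MMX -> "mmx")
# via the flags the Linux kernel reports in /proc/cpuinfo.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for name, flag in (("CMPXCHG8B", "cx8"), ("MMX", "mmx")):
    print(f"{name}: {'present' if flag in flags else 'missing'}")
```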

None of them will exist in 10 years (0)

Anonymous Coward | more than 4 years ago | (#31322814)

In 2012 a huge solar flare will hit the earth, disabling all electronic equipment and sending the world into a Mad Max-like landscape where we battle for fuel to run some badass-looking off-road monsters. Intel, AMD, and Nvidia will be the names of our grandchildren's children.
Meet my boy Athlon XP, he is much smarter than your Intel boy, but sort of a hothead ;-P

Open drivers (1)

ElusiveJoe (1716808) | more than 4 years ago | (#31322828)

I just wish AMD would finish its open-source drivers and ditch the infamous proprietary ATi legacy. Then Nvidia will have no choice but to open their drivers or go bankrupt. I know, I am a dreamer.
