
IBM's Plans For the Cell Processor

Soulskill posted about 4 years ago | from the breeding-a-better-hamster dept.


angry tapir writes "Development around the original Cell processor hasn't stalled, and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said. IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. 'We want to stay in the business, we intend to stay in the business,' he said. IBM confirmed in a statement that it continues to manufacture the Cell processor for use by Sony in its PlayStation 3. IBM also will continue to invest in Cell as part of its hybrid and multicore chip strategy, Menon said."


124 comments


Newsflash: (0)

Anonymous Coward | about 4 years ago | (#33866366)

We're going to continue working.

Whats a Future Power Road Map? (-1, Troll)

furgle (1825812) | about 4 years ago | (#33866432)

also

Can anyone explain this to me: is there anything significant about Cell processors? Or is it a new (or old [I tend to miss these things]) buzzword or technology? Is it IBM branding? Is it a new type of processor that works by decoding DNA?

I read the article, and all I read was a bunch of words with very little meaning.

Re:Whats a Future Power Road Map? (0)

Anonymous Coward | about 4 years ago | (#33866496)

also

Can anyone explain this to me: is there anything significant about Cell processors? Or is it a new (or old [I tend to miss these things]) buzzword or technology? Is it IBM branding? Is it a new type of processor that works by decoding DNA?

I read the article, and all I read was a bunch of words with very little meaning.

Power is the architecture for IBM's CPUs. Hence PowerPC, POWER5, etc. It's more of a family of chips than a particular ISA or core design.

A "Road Map" is a published plan for product releases, usually with only general features and dates described in quarters. It's mostly marketing, and only slightly better than vaporware.

The future is the time that follows the present.

Re:Whats a Future Power Road Map? (4, Interesting)

Nursie (632944) | about 4 years ago | (#33866576)

You've really missed hearing about Cell?

It's a new processor architecture, IBM and Sony (and possibly others) had a hand in it. Effectively two "Power" cores and a bunch of vector processing units. It's supposed to be very very good for vector operations. For a while (a few years back now) the world's most powerful supercomputer was a machine composed of nodes containing two cell processors and an Opteron each.

It's different from other parallelisation strategies in that the vector units (SPUs/SPEs) let you parallelise work at an operation level, unlike just stuffing more cores into the box, which is the Intel/PC strategy. For games and graphics this is thought to be good, hence its inclusion in the PlayStation 3. It's also supposed to be good for scientific computing.

I guess you could think of it as somewhere between a CPU and a GPU, or a hybrid of the two approaches.
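To make the "operation level" point concrete, here's a toy sketch in plain Python. It is purely illustrative: a real SPE performs these 4-wide operations in hardware on 128-bit registers, while here each slice merely stands in for one vector instruction.

```python
# Toy model of operation-level (SIMD) parallelism, as on a Cell SPE.
# Illustrative only: a real SPE does this in hardware on 128-bit registers.

def scalar_add(a, b):
    # Conventional model: one element per "instruction".
    return [x + y for x, y in zip(a, b)]

def simd_add(a, b, width=4):
    # SIMD model: one *vector* per "instruction", `width` lanes at a time.
    out = []
    for i in range(0, len(a), width):
        out.extend(x + y for x, y in zip(a[i:i + width], b[i:i + width]))
    return out
```

Eight elements take eight steps in the scalar loop but only two vector steps in the SIMD loop; that ratio is where the throughput comes from.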

Re:Whats a Future Power Road Map? (2, Insightful)

keeboo (724305) | about 4 years ago | (#33866668)

So, shortly:
Cell is a processor with two PPC cores, interfaced with a bunch of auxiliary CPU cores optimized for SIMD, each with its local memory.
Right?

Re:Whats a Future Power Road Map? (2, Insightful)

Nursie (632944) | about 4 years ago | (#33866804)

On further reading - not two PPC cores, one core with two threads using a similar (but possibly superior) technology to hyperthreading.

But yeah, essentially your short description there is correct.

Also, I've looked at the Top500 list: the Cell, though not the variant in the PlayStation, is in Roadrunner. Roadrunner is the third-fastest computer on the planet.

Re:Whats a Future Power Road Map? (0)

Anonymous Coward | about 4 years ago | (#33867204)

A bit like that in effect, although the architecture is quite different from a GPU. GPUs are aggressively SIMD, literally doing the same operation on several data items at the same time. If you have to branch, the GPU does one pass of the branched bit for each branch (warp divergence, in CUDA-speak). Operations with fine-grained branching below the level of an individual warp will reduce the efficiency of the GPU.
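A rough sketch of that divergence behaviour (plain Python, illustrative only): the warp cannot branch per lane, so both sides of the branch are executed and a per-lane predicate mask selects each lane's result.

```python
# Toy model of GPU branch divergence within a warp: every lane pays for
# *both* sides of the branch, and a per-lane predicate mask picks the
# result that "its" branch would have produced.

def warp_branch(values):
    mask = [v >= 0 for v in values]       # per-lane predicate
    then_side = [v * 2 for v in values]   # pass 1: all lanes run the 'then' path
    else_side = [-v for v in values]      # pass 2: all lanes run the 'else' path
    return [t if m else e for m, t, e in zip(mask, then_side, else_side)]
```

Even if only one lane diverges, both passes are executed; the Cell's SPEs, each running its own instruction stream, avoid this particular cost.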

The Cell has 8 small CPU elements with 256 KB of local memory each (called Synergistic Processing Elements, or SPEs). Peak floating-point speed was somewhat slower than contemporary GPU chips like the GeForce 8 family, but Cells don't have the divergence problem. This gives them some strengths over GPUs for certain types of computation, but the 256 KB local-memory limit constrains the amount of data an SPE has access to at any given time. The Cell is probably closer to a traditional DSP than to a GPU.

GPUs have fairly clearly won the war for mindshare in the PC-based vector bashing market and are much, much cheaper than any commercial Cell based product except the PS3. If the article is to be believed, IBM are rolling the technology into their mainstream Power 8 products, which will give them very good floating point performance directly on-chip.

High-end Power boxes can take several TB of memory and IBM sell quite a few to clients who want a shared memory number crunching system (which can't be done with clusters of commodity x86 boxes). A big shared-memory box with GPU level floating point throughput might be quite a win for IBM in some markets.

Re:Whats a Future Power Road Map? (2, Insightful)

Jesus_666 (702802) | about 4 years ago | (#33867632)

For games and graphics this is thought to be good, hence its inclusion in the PlayStation 3.

Of course game developers tend to be a bit more sceptical. The Cell requires a very specific way of programming (if you don't align your data flow to the processor's capabilities, performance nose-dives), which doesn't go over well with people who have limited time to make their game/engine work on several different platforms, most of which work roughly the same.

I attended the Games Convention Developers Conference 2008. A number of panelists mentioned that what they presented was harder to get working on the Cell due to its unique requirements. It really does require a different approach to every other system on the market.

Add to that the fact that the PS3 doesn't appear to deliver obviously superior performance to the more conventional X360, and the question arises whether the Cell is worth the hassle in the gaming sector. Scientific programmers can afford to write system-specific code and jump through hoops to attain maximum performance (after all, 10% faster execution may mean their calculations finish a month or more sooner). Game developers, on the other hand, are on a very tight development schedule and might make a better game on a slightly less powerful but conventional platform.

PS3 Graphically Annihilated The 360 This Gen (-1, Troll)

Anonymous Coward | about 4 years ago | (#33867706)

"Add to that the fact that the PS3 doesn't appear to deliver obviously superior performance to the more conventional X360"

Boggle.

Are there really Xbox fanboys THAT pathetic and delusional?

Guy, it's one thing to root for your crappy and underpowered Xbox 360 console. It's another thing to be off in la-la land denial that the Xbox 360 has gotten destroyed graphically this gen.

The Xbox 360 has gotten so badly beaten this gen by the PS3's Cell+RSX rendering system that it is the first console in history to have its fans have to resort to babbling about 'teh multiplats look better' because the exclusive title graphics of the 360 are not even in the same league as the PS3.

Microsoft's botched and wimpy Xbox 360 graphics hardware has turned out so bad this gen that the only thing the underpowered console had to show for itself was inane 5000x5000 fake marketing shots from Epic for their laughably outdated Unreal engine and its massive overuse of 'bumpy shiny normal maps everywhere!'

"the Cell is worth the hassle in the gaming sector."

LOL! Awww, you mean all those x86/Windows dev hacks who cried cuz they didn't understand teh big bad Cell chip. Poor babies!

Go away fanboy.

LOL? What? (-1, Troll)

Anonymous Coward | about 4 years ago | (#33867960)

Go back to playing Halo(looks just as good as Killzone 2! cuz teh Xbox is 'just as powerful as teh PS3 LOL) instead of trying to pretend you have a fucking clue about console game development fanboy.

Re:LOL? What? (1)

berwiki (989827) | about 4 years ago | (#33868238)

killzone doesn't look better than modern warfare 2. and that game is on both platforms.

you are the one who sounds like a damn fanboy.

Xbot Fail (-1, Flamebait)

Anonymous Coward | about 4 years ago | (#33868400)

Xbots, the fucking shit stains of the gaming world.

First Killzone 2 was 'impossible Sony lies!!!' cuz no game could ever have CGI level graphics like those shown.

Now, that fucking sub HD graphical turd MW2 is in the same league.

Someone needs to smack the fucking shit out of your faggot ass.

Re:Whats a Future Power Road Map? (1)

cK-Gunslinger (443452) | about 4 years ago | (#33868206)

I think the true power of the PS3/Cell will be its longevity.

Look at the PS2. Now look at the 1st gen games for it versus some of the latest ones. The differences are huge, and they are due purely to better programming techniques (same hardware.) I've no doubt that the PS3/Cell will have a similar lifespan.

Also, I know it's discussed in almost every tech generation of consoles, but this time it might be true: is the hardware finally good enough? This may be directly influenced by the popularity of Flash-based and iPhone games. Is the game market still being driven by the faster-polygon-pushing race? Maybe not...

one Power core, not two (1)

electrosoccertux (874415) | about 4 years ago | (#33868182)

Just FYI.

IMO this was one of the main failures of the architecture. Xbox360 developers just have to worry about parallelizing their code, Cell developers on top of that have to worry about writing code that can make use of the SPE's, let alone efficient use of them.

The Cell was designed back when Sony needed hardware that could decode high-definition Blu-ray streams. I think this is why the SPEs are useful for decoding operations and little else in the gaming world.

Re:Whats a Future Power Road Map? (1)

Sockatume (732728) | about 4 years ago | (#33868590)

Interestingly, Cell was tolerant of losing SPUs in manufacture. A lot of "bad" chips would've been used as lower-end Cells for cheaper devices, while being essentially the same platform as far as developers were concerned. I don't think much came of that though. One laptop with a 4-SPU Cell, talk of a 2-SPU Cell as a video processor in a high-end HDTV. A shame, really, as they had a lot of half-dead Cells rolling off the line when they were trying to crank them out for the PS3 launch. Wonder what happened to them.

Re:Whats a Future Power Road Map? (1)

StuartHankins (1020819) | about 4 years ago | (#33869200)

Yes, your response was informative and I know you're trying to help.

But seriously, if this person has no idea what a Cell processor is, I'm pretty sure the concept of CPU optimization will be lost on them. You could say it was a new type of chip made by elves to regrow tissue and they would probably believe it. Just how out of touch would someone have to be to miss the Cell, and not bother to Google it before posting?

Re:Whats a Future Power Road Map? (0)

Anonymous Coward | about 4 years ago | (#33866654)

The Cell architecture was used in the PlayStation 3; it is designed to have many simple cores working in parallel. It is good at embarrassingly parallel tasks like streaming video and rendering, but that is really all it is good at -- the individual SPEs currently have working sets much too small for HPC.

We like money! (4, Insightful)

allaunjsilverfox2 (882195) | about 4 years ago | (#33866436)

What business would want to give up guaranteed sales? I mean, a gaming platform is like walking into a bank, depositing one cent and then getting a cent every second until the bank closes.

Multicore for raytracing? (3, Insightful)

gizmod (931775) | about 4 years ago | (#33866438)

Bring on a 12 core PS4 with raytracing games.

Re:Multicore for raytracing? (2, Informative)

Anonymous Coward | about 4 years ago | (#33866558)

The Atari Transputer Workstation [wikipedia.org] already did that in the 80s. Coolest real-time raytracing ever!

Re:Multicore for raytracing? (1)

CarpetShark (865376) | about 4 years ago | (#33866996)

Hmm. I'd never looked into the Atari Transputer much. I figured it was a lot like an Amiga 2000/3000, but overhyped, and with GEM :) Turns out it was quite a machine, with a lot of innovation that's only catching on in PCs now. If it wasn't for the lack of an MMU, I might have liked to see it replace both Amigas and PCs :) Also, a lot of the stuff here:

http://en.wikipedia.org/wiki/Transputer [wikipedia.org]

Sounds like a summary of the Cell's raison d'etre.

Couple of questions:

* Is there an emulator of this, so I can check out how usable it was?

* I've read that it was capable of ~260 megaflops, which does seem to be in the realtime raytracing ballpark. However, I think that was a fully-equipped or high-end version? How did the pricing/configs work?

* What's this about a stack-based architecture that made having no MMU less of a problem?

Yes, but OCCAM... (0)

Anonymous Coward | about 4 years ago | (#33867300)

...they copied this crappy Python leading-whitespace syntax. That's why they failed, I guess.

(runs to hide) Hey, folks. Just kidding. Hey!

Re:Yes, but OCCAM... (1)

AlecC (512609) | about 4 years ago | (#33867340)

They originated it, not copied it.

Re:Multicore for raytracing? (0)

Anonymous Coward | about 4 years ago | (#33867374)

* What's this about a stack-based architecture that made having no MMU less of a problem?

That statement struck me as well. I think that one requires a "citation needed" since I've never heard that before. Are stack machines unable to address memory outside the stack? That's rather unlikely. Perhaps because they do it less, the Wiki editor figured it was "less" of a problem.

Re:Multicore for raytracing? (1)

AlecC (512609) | about 4 years ago | (#33867382)

There is an emulator, under active development; see posts on the newsgroup comp.sys.transputer.

The 260 megaflops must be for some kind of an array - they were designed to be used in arrays. The individual transputers never clocked faster than 25 MHz, though the FPU on the T800 was relatively fast for the time. Each transputer had four bidirectional links connected to DMA engines wired directly into the hardware scheduler, so that inter-processor communications were very low cost.

I can't see why the architecture made no MMU less of a problem. The architecture was designed with very few registers, which made thread switching (implemented in hardware) a very lightweight operation. It was excellent at multithreading (for which you don't need an MMU) but simply didn't do multiprocessing, full stop.

It was a beautifully engineered device - but the problem space it was engineered for was too small for it to be viable. One of the problems was that the four links only allowed the problem to be mapped in 2-D - if your problem didn't flatten to 2-D, it would go like a dog.
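The link model described above is essentially CSP, which occam exposed directly: blocking, point-to-point channels between processes. A toy sketch with two threads standing in for two transputers (Python's `queue` is only a loose approximation of occam's synchronous rendezvous):

```python
import queue
import threading

# One transputer-style link: a point-to-point channel. maxsize=1 keeps
# the sender and receiver in near-lockstep, roughly like occam's rendezvous.
link = queue.Queue(maxsize=1)
results = []

def producer(n):
    for i in range(n):
        link.put(i * i)    # blocks until the neighbour has taken the last item
    link.put(None)         # end-of-stream marker (our own convention)

def consumer():
    while True:
        msg = link.get()
        if msg is None:
            break
        results.append(msg)

t1 = threading.Thread(target=producer, args=(5,))
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

On real hardware the four links were DMA engines feeding the hardware scheduler, so a blocked "put" simply descheduled the process at almost no cost.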

Re:Multicore for raytracing? (1)

robthebloke (1308483) | about 4 years ago | (#33869778)

No please don't. We've got enough on our plate with the PS3 as it is.....

So where's the story here? (0)

Anonymous Coward | about 4 years ago | (#33866440)

So what? This deserves a post?

Why not tell us that Intel is going to continue to make chips too?

Re:So where's the story here? (4, Informative)

KugelKurt (908765) | about 4 years ago | (#33866510)

The story is not that IBM continues to manufacture chips but that the Cell design is not dead. This contradicts earlier stories to some degree.
In all fairness, it contradicts only on the surface, as IBM only stated in the older story that Cell as a separate design would end and its co-processor-heavy design would merge with future POWER iterations.

There were also rumors that IBM won't manufacture PS3 Cell CPUs any longer, leaving it to contractors.

Re:So where's the story here? (0, Offtopic)

c0lo (1497653) | about 4 years ago | (#33866598)

The story is not that IBM continues to manufacture chips but that the Cell design is not dead.

To be frank, while playing games I still prefer Pringles [wikipedia.org] to any other chips!
(in other words: WTF should I care what chip my gaming machine is using?)

Re:So where's the story here? (0)

Anonymous Coward | about 4 years ago | (#33866694)

The intersection between hardware geeks and gamers is not nearly as narrow as you may assume it to be, so a lot of people are still interested in this.

At the very least, you should acknowledge that the continued development of gaming devices (and associated technology) is spreading out into improvements in many other fields of technology, some of which you may find more interesting/relevant to your everyday life.

Re:So where's the story here? (1)

c0lo (1497653) | about 4 years ago | (#33866742)

At the very least, you should acknowledge that the continued development of gaming devices (and associated technology) is spreading out into improvements in many other fields of technology, some of which you may find more interesting/relevant to your everyday life.

I'll acknowledge it if you like. But I fail to see how the Cell chip in particular has achieved this: of all the uses of the technology, only Mercury computers [wikipedia.org] are unrelated to gaming.
Yes, until some time ago one could run Linux on the PS3 (thus making use of the Cell chip outside the entertainment area)... but rumor has it that's no longer possible.
Do you know otherwise?

Re:So where's the story here? (0)

Anonymous Coward | about 4 years ago | (#33866702)

You are on the wrong website.

Re:So where's the story here? (0)

Anonymous Coward | about 4 years ago | (#33866732)

One word: eww. Save your taste buds and try some Utz [utzsnacks.com] instead, bruddah!

Re:So where's the story here? (1)

AlecC (512609) | about 4 years ago | (#33867394)

It is actually one branch of what appears to be a fork in gaming machines: ultra-high-performance renderers like the PS3, and peripheral-driven, lower-performance systems like the Wii. Some people have said that the Wii is the way of the future: current-generation renderers do all the graphics you need, and gaming development will happen in the UI, not in graphics. This is a step down the opposite path: we can and should get better graphics.

Re:So where's the story here? (4, Interesting)

hairyfeet (841228) | about 4 years ago | (#33867688)

I have been wondering just how long it will take for the "ooohhh shiny!" factor to wear thin. Hell, I fire up Far Cry or Wolfenstein on my $36 HD4650 and people stand around and go "oooohhh". You really don't need any more to have decent immersion in a game, especially in an FPS: if the game is worth a damn you are too busy dodging fire to stand around and look at the shiny. Then add in the spiraling costs and delays to market that piling on the "ohhh shiny" brings, and it quickly becomes "get a hit, and on time, or we're all out of business", and that simply isn't sustainable long term.

That is why I wouldn't be surprised if the next-gen gaming consoles do something similar to the original Xbox, which I thought was a damned good idea at the time. You could take a cheap ULV Phenom II quad, add a 5xxx Radeon GPU and some decent controllers, and have the average Joe drooling at the "ooohhh shiny" for a long time; the combination of cheap hardware, the ability for developers to easily code with tools they already have, and the quick time to market would probably make it a hit.

I just don't see how the incredible sums required to bring out a new generation of consoles could avoid seriously hurting a company's bottom line. With a more off-the-shelf approach, all they have to do is cook up the DRM and a close-to-bare-metal OS, then let economies of scale keep the price low out of the gate and drive it even lower as time goes on. While MSFT could blow the cash simply because they have twin cash cows in Office and Windows, I doubt Sony will be able to afford the needed capital, and Nintendo has made it pretty clear they aren't gonna play the "ooohhh shiny!" game at all, targeting the Wii at casual gamers. I just don't see a never-ending "ooohhh shiny" arms race being good for anybody. Just look at how ATI is using Eyefinity to push new GPUs while Nvidia looks at HPC with CUDA; even they know the "ooohhh shiny" can only go so far. Hell, I figured when I got the HD4650 it would just be a stopgap until I could get a $150+ GPU, but now? It plays BioShock 2 and everything else I throw at it with plenty of ooohh shiny and doesn't turn my apartment into a sauna, so why bother? I used to be a serious graphics whore, but even I got tired of the ooohh shiny and now prefer games that are actually...what's the word?...oh yeah, FUN. I'm starting to wonder if the whole graphics race is hitting a dead end.

Re:So where's the story here? (0)

TheKidWho (705796) | about 4 years ago | (#33868456)

The reason your HD4650 is sufficient is that you're playing essentially console games on a graphics card that's slightly more powerful than the consoles themselves. Also, you're only running at a low resolution; try gaming at 1920x1080 on your 4650 and it won't get you too far.

That is why I wouldn't be surprised if the next gen gaming consoles don't do something similar to the original Xbox

There is a reason they didn't do that again: it's more expensive than a dedicated console...

Re:So where's the story here? (1)

l33t gambler (739436) | about 4 years ago | (#33868706)

Yeah, until you see Modern Warfare 2 at 1920x1200, highest settings, on an LCD monitor. It looks so sharp, and the specular and bump mapping and huge texture resolution really blew me away. The hundreds of particles from the various fires in the game, for instance the burning tree in the suburb map, are an amazing sight I'd never seen before. At 120 Hz, for a more solid experience when you look around.

But yes, gameplay is just as important. Monsters in Doom 1 and 2 that left pixelated blood splatters on the walls, and being stunned when hit, are things I've missed in current FPS games. Doom 3, Quake 4 and Prey are pretty dumb: red puffy sprites for blood and no depth in the gameplay. Destructible environments we had a long time ago, and they've been missed. The depth of games like System Shock 2 and Baldur's Gate might never come again: many paths to choose, several ways to complete a level, hundreds of character-development options. Add in an incredible story, excellent voice acting and exceedingly well-written dialogue. Dragon Age is childish in comparison and doesn't deserve the label "Baldur's Gate spiritual successor" by any means.

http://jooh.no/web/bloodshot_sprite_texture_puff_quake_3.jpg [jooh.no]
http://jooh.no/web/Doom2_pixellated_blood.png [jooh.no]
http://jooh.no/web/XCom_UFO_what_went_off_here_640.png [jooh.no]
http://jooh.no/ss_baldurs_gate.html [jooh.no]

affordable (1)

aerton (748473) | about 4 years ago | (#33866454)

I wish I could buy a consumer-priced system with one of these CPUs. A very interesting system to develop for. After all, we are all going to use some kind of system with a separate memory model like this once we reach the end of scalability for today's dominant shared-memory multicore CPUs.

I hope that the PS4 (or another console using it) will be Linux-friendly, as the PS3 was until Sony blew it. Alas, however slim this chance is, there seems to be no better one.

Re:affordable (1, Informative)

Nursie (632944) | about 4 years ago | (#33866586)

Get PS3 with 3.41 or earlier firmware -> Jailbreak -> install linux.

Profit?

(Actually, Linux for jailbroken PS3s is in the very early stages, but I'm sure it'll get there.)

Makes more sense to buy into a... (1)

EzInKy (115248) | about 4 years ago | (#33867306)

...platform where you don't have to worry about some idiot company dictating what software you run on the hardware you purchased, don't you think?

Re:Makes more sense to buy into a... (1)

Nursie (632944) | about 4 years ago | (#33867366)

Oh sure, but if your aim is to try out cell programming, then that's pretty much your only option at present!

It would probably be better to try using CUDA and your graphics card...

Re:affordable (0)

Anonymous Coward | about 4 years ago | (#33868366)

Get PS3 with 3.41 or earlier firmware -> Jailbreak -> install linux.

Ahhh yes, another slashbot who totally ignores the fact that ANYONE could install Linux on a PS3 for well over 3 years. Suddenly it's only a reasonable option because Sony put up some roadblocks.

Re:affordable (1)

Nursie (632944) | about 4 years ago | (#33869022)

What?

Why the insults? I'm perfectly aware that circumvention is only necessary because of Sony asshattery, but as of *now* you can get a machine with firmware up to 3.41 and jailbreak it. Initial booting of Linux has happened, and I'd expect to see it made relatively simple within a few weeks or a couple of months.

To spread more (1)

AHuxley (892839) | about 4 years ago | (#33866482)

games over more cores with less heat.
Great, but where is the software expertise going to come from?
It seems to take years for any third party to work out how to optimise "anything" HD for these systems.
With a push for more cores, how about a push for more developer support vs "cloud-based" and P2P servers?

With more memory per CPU, it might not suck (5, Interesting)

Animats (122034) | about 4 years ago | (#33866498)

The basic problem with the Cell processor is that it has 256KB (not MB, KB) per processor, plus a bulk transfer mechanism to main memory. Given that model, it has to be programmed like a DSP: very little state, processing works on data streams. For games, this sucks. No SPE has enough memory for a full frame, or for the geometry, or a level map. Trying to hammer programs into that model is painful. (Except for audio. It's great for audio.) In many PS3 games, the main PPE core is doing most of the work, with the SPEs handling audio, networking, and I/O. And, of course, Sony had to put an Nvidia graphics processor in the thing late in the development cycle, once people finally realized that the SPEs couldn't handle the rendering.

But if each Cell CPU had, say, 16MB, the Cell machines could be treated more like a cluster. Programming for clusters is well understood, and not too tough.

It's probably too late, though. Multi-core shared memory cache-consistent machines are now too good. It's not necessary to use an architecture as painful as the Cell. It's probably destined for the graveyard of weird architectures, along with data flow machines, hypercubes, SIMD machines, systolic processors, semi-shared-memory multiprocessors, and similar hardware that's straightforward to build but tough to program.
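The DSP-style model the parent describes can be sketched in a few lines (plain Python, sizes shrunk for illustration): the kernel can only see a tiny local store, so it streams through main memory chunk by chunk, with list slicing standing in for the DMA engine.

```python
# Toy model of Cell-style local-store streaming: an SPE cannot address
# main memory directly, so data is DMA'd into a small local store,
# processed there, and DMA'd back out. Slicing stands in for DMA, and
# the "local store" here holds 8 values instead of 256 KB.

LOCAL_STORE = 8  # illustrative capacity, in elements

def stream_scale(main_memory, factor):
    out = []
    for base in range(0, len(main_memory), LOCAL_STORE):
        local = main_memory[base:base + LOCAL_STORE]   # "DMA in"
        local = [x * factor for x in local]            # compute on local data only
        out.extend(local)                              # "DMA out"
    return out
```

Real SPE code would double-buffer, DMA-ing the next chunk in while the current one is processed, to hide the transfer latency; anything that needs random access to a large working set fights this model constantly.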

Re:With more memory per CPU, it might not suck (1)

bhcompy (1877290) | about 4 years ago | (#33866530)

Well, IIRC its intended purpose was embedded devices. They were talking about smart fridges, security systems, etc.: basically networking your home with smart devices running on Cell, and then being able to use that processor juice distributed across the devices, since the Cell scales very well (which is why the PS3 makes a great supercomputer farm).

Re:With more memory per CPU, it might not suck (1, Informative)

Anonymous Coward | about 4 years ago | (#33866914)

No; no you don't recall correctly, not even a little bit. Not a jot, not a tittle. Cell was designed specifically for the PS3, and maybe for other kinds of (repetitive streaming type) work that is mostly done by GPUs and/or CUDA in this day and age.

You are right, and Sony killed any future... (1)

EzInKy (115248) | about 4 years ago | (#33867350)

...the Cell might have had when they locked down the PS3.

Re:With more memory per CPU, it might not suck (1)

KingFrog (1888802) | about 4 years ago | (#33866906)

Yep, Cell is being used far outside its original design spec. Of course, if gaming consoles are its current largest market, the next generation will probably look much more like a standard POWER6 or 7 in its architecture: more emphasis on more powerful support cores, more memory per core, and all the other things that have made their way into every other CPU family currently popular.

Cell Is Being Used Exactly How It Was Designed For (2, Interesting)

MediaStreams (1461187) | about 4 years ago | (#33868030)

Back in the early PS2 days we would talk about what a next-generation PS2 would look like. Those whiteboard diagrams looked almost identical to what Sony and IBM came up with.

The parallels between the PS2/EE/GS and the PS3/Cell/RSX are striking:

Execution starts on the EE/PPU
Heavy/parallel computation task is spawned off to the VUs/SPUs
Light control code runs in parallel on the EE/PPU
As graphical elements become ready to be rasterized they are spawned off to the GS/RSX

In a well running PS2/PS3 engine all three major areas are running full speed in parallel. Split memory architecture lets each area of the machine run at full speed without interfering with the rest of the system.

Kutaragi and IBM did a masterful job. It was an obvious choice to build off the model of the most successful console architecture in history, and the one all console developers had intimate knowledge of: the 145-million-selling PS2.

Re:Cell Is Being Used Exactly How It Was Designed (2, Interesting)

CronoCloud (590650) | about 4 years ago | (#33868296)

I remember reading somewhere that one of the goals in PS2 programming was keeping that DMAC running full tilt streaming data. Ah, found it, Ars Technica:

http://arstechnica.com/old/content/2000/04/ps2vspc.ars/4 [arstechnica.com]

Valves hybrid threading (3, Informative)

l33t gambler (739436) | about 4 years ago | (#33868768)

I found this article interesting. It describes Valve's approach to multi-core CPUs and game engines.

The programmers at Valve considered three different models to solve their problem. The first was called "coarse threading" and was the easiest to implement. Many companies are already using coarse threading to improve their games for multiple core systems. The idea is to put whole subsystems on separate cores; for example, graphics rendering on one, AI on another, sound on a third, and so on. The problem with this approach is that some subsystems are less demanding on CPU time than others. Giving sound, for example, a whole core to itself would often leave up to 80 percent of that core sitting unused.

The second approach was fine-grained threading, which separates tasks into many discrete elements and then distributes them among as many cores as are available. For example, a loop that updates the position of 1,000 objects based on their velocity can be divided among, say, four cores, with each core handling 250 objects apiece. The drawback with this approach is that not all tasks divide neatly into discrete components that can operate independently. Also, if some entries in the list take longer to update than others, it becomes harder to scale the tasks evenly across multiple cores. Finally, the issue of memory bandwidth quickly becomes a limitation with this method. For certain specialized tasks, such as compiling, fine-grained threading works really well. Valve has already implemented a system whereby every computer in their offices automatically acts as a compiler node. When the programmers were getting ready to demonstrate their results on the conference room computer with the big screen, they had to quickly deactivate this feature first!

The approach that Valve finally chose was a combination of the coarse and fine-grained, with some extra enhancements thrown in. Some systems were split on multiple cores using coarse threading. Other tasks, such as VVIS (the calculations of what objects are visible to the player from their point of view) were split up using fine-grained threading. Lastly, whenever part of a core is idle, work that can be precalculated without lagging or adversely affecting the game experience (such as AI calculations or pathfinding) was queued up to be delivered to the game engine later.

Valve's approach was the most difficult of all possible methods for utilizing multiple cores, but if they could pull it off, it would deliver the maximum possible benefits on systems like Intel's new quad-core Kentsfield chips.

To deliver this hybrid threading platform, Valve made use of expert programmers like Tom Leonard, who was writing multithreaded code as early as 1991 when he worked on C++ development tools for companies like Zortech and Symantec. Tom walked us through the thought process behind Valve's new threading model.

http://arstechnica.com/gaming/news/2006/11/valve-multicore.ars [arstechnica.com]
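The fine-grained model described in the quote — splitting a 1,000-object velocity update across four cores — has a very simple shape. Here's an illustrative Python sketch (not Valve's actual engine code; names and data layout are made up):

```python
from concurrent.futures import ThreadPoolExecutor

def update_positions(objects, dt, workers=4):
    """Fine-grained threading sketch: one update loop split across cores."""
    def update_chunk(chunk):
        for obj in chunk:
            obj["pos"] = tuple(p + v * dt for p, v in zip(obj["pos"], obj["vel"]))

    # Divide the list into roughly equal chunks, one per worker.
    size = (len(objects) + workers - 1) // workers
    chunks = [objects[i:i + size] for i in range(0, len(objects), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(update_chunk, chunks))

objs = [{"pos": (0.0, 0.0), "vel": (1.0, 2.0)} for _ in range(1000)]
update_positions(objs, dt=0.5)
# each object ends at pos (0.5, 1.0)
```

Note the two caveats from the article apply directly: if some chunks take longer than others the cores finish unevenly, and with a trivial per-object kernel like this one you saturate memory bandwidth long before you saturate the ALUs.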

Re:With more memory per CPU, it might not suck (0)

Anonymous Coward | about 4 years ago | (#33867282)

256K should be enough for anybody.

But seriously, don't the memory buses in NUMA machines saturate at about 16 cores (depending on the problem set, of course)?

No, the basic problem with the Cell... (0)

EzInKy (115248) | about 4 years ago | (#33867344)

...processor is that the company selling its flagship product decided to lock out people wanting to experiment with it.

Re:No, the basic problem with the Cell... (1, Insightful)

Anonymous Coward | about 4 years ago | (#33867672)

...processor is that the company selling its flagship product decided to lock out people wanting to experiment with it.

Because those people made such progress after having nearly four years to experiment? It's time people around here quit pretending like Sony never gave them the chance to dink around with the PS3.

Re:No, the basic problem with the Cell... (0)

drinkypoo (153816) | about 4 years ago | (#33867860)

...processor is that the company selling its flagship product decided to lock out people wanting to experiment with it.

Fail. The flagship Cell processor is a more-capable unit that IBM will sell you for exorbitant amounts of money. The Cell in the PS3 is a toy version, and even mentioning that it is based on Cell is just marketing for IBM's real thing.

So MS's 360 Got Humiliated By IBM's Toy Version? (-1)

Anonymous Coward | about 4 years ago | (#33867918)

"The Cell in the PS3 is a toy version and even mentioning that it is based on cell is only marketing for the real thing to IBM."

LOL, fanboys.

That's gotta sting Xbox 360 developers - to have fanboys call the chip that beat the shit out of you this gen nothing but a 'toy version'.

Re:So MS's 360 Got Humiliated By IBM's Toy Version (1)

drinkypoo (153816) | about 4 years ago | (#33868086)

That's gotta sting Xbox 360 developers - to have fanboys call the chip that beat the shit out of you this gen nothing but a 'toy version'.

The Xbox 360 is also powered by a 'toy version' of PowerPC which is a 'toy version' of POWER.

Also, I think Sony, Microsoft, and Nintendo are all evil, and I do my best not to give any of them money any more. That means buying everything used and not paying for Live Gold. If that makes me a fanboy, then your comment makes you my bitch. But we knew that already because you're an anonymous pussy.

Re:No, the basic problem with the Cell... (0, Troll)

Lunix Nutcase (1092239) | about 4 years ago | (#33868162)

And nothing of value was lost to them. The only thing related to the PS3 that interests Sony is the selling of games, Blu-Rays and stuff from PSN. A bunch of basement dwellers installing Linux on their PS3 was an afterthought at best.

Re:With more memory per CPU, it might not suck (0)

Anonymous Coward | about 4 years ago | (#33867402)

Mod up.
Poster is almost correct, but now there are clever multiprocessor GPU's. Same result, less trouble.
There is only one market for core - fluid dynamics, and the market share for this outcome is less than gamer gpus = game set and match.

NVIDIA delivers GPUs with up to 30 multiprocessors and 240 thread processors. Each clock, each thread processor can produce a result, giving this design a very high peak performance... and you can add two or more of them to one fairly decent 6-8-way CPU without having to take on a 'new and different' headache: http://www.thedailytech.com/nvidia-geforce-gtx-460-debut-multiprocessor-graphic-card.html.

Modified to deliver full-on crypto and Tor connections, Big Brother's snoopathon party could be in for a rude shock: parallel CBC and giant S- and P-boxes for AES/Blowfish.

Re:With more memory per CPU, it might not suck (0)

Anonymous Coward | about 4 years ago | (#33867678)

Very valid points indeed. In fact, if you read the interview, at no point is there any mention of a road map, future expansion of this concept, or enhancement of the Cell architecture. There were (and still are) rumours that the PSP2 will contain a 4-SPU variant of the Cell, but I don't see a potential PS4 going down this route. There are better solutions out there now, with the main selling point being 'easier to program'. I know nothing about clustering, but it sounds exciting as a future possibility. My only concern for the PS4 (if there is one) is cost. When the PS3 first shipped, its price was hyper-inflated by the Cell dev costs and yield. I can see the same happening again. Interesting times, though.

Laughable Drivel (2, Interesting)

RingBus (1912660) | about 4 years ago | (#33867782)

"Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering."

My god. You are repeating that Beyond3d forum lie in late 2010???

"For games, this sucks"
"Trying to hammer programs into that model is painful. (Except for audio. It's great for audio.)"
"In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O."
"It's not necessary to use an architecture as painful as the Cell."
"tough to program."

It's like you tried to parrot every Beyond3d x86 fanboy talking point you could remember.

Re:Laughable Drivel (0)

Anonymous Coward | about 4 years ago | (#33867814)

>>"Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering."

>>My god. You are repeating that Beyond3d forum lie in late 2010???

x86 fanboys embraced this lie because it lets them pretend poor liddle Sony 'hyped teh Cell' and then had to come crawling to NVidia for help.

It doesn't matter if it is laughably false. They will never stop repeating it. Not that it really matters with just how badly the PS3 has beaten the Xbox 360 graphically. This was crap x86 and Xbox 360 fans latched onto back in 2006/7 when they were desperately trying to spread lies about the performance of the PS3.

PS3's Cell+RSX Graphical Dominance (0)

Anonymous Coward | about 4 years ago | (#33867888)

You would think that in late 2010 that the x86 fans would be smart enough to just keep their mouths shut to avoid looking like angry, delusional fans butt-hurt over how amazing PS3 games have turned out compared to the significantly weaker Xbox 360.

Uncharted
Uncharted 2
Killzone 2
Gran Turismo 5

just to name a few of this gen's graphical kings.

With all of Microsoft's billions, all the idiotic fanboy babble about 'easy to program' Xbox 360, all the idiotic fanboy babble about 'hard teh program PS3(Cell)' and yet the Xbox 360 years on still doesn't even have a game that is up to the 3.5 year old Uncharted on the PS3.

There isn't a single graphical area in which the PS3 hasn't destroyed the Xbox 360 this gen:

Resolution
Materials
Lighting
Poly counts
Screen complexity/number of objects
Particle effects
Animation
Deformation

Games like Gran Turismo 5 are running at 2.25 times the resolution of Microsoft's first party Forza games while running an engine that looks a generation ahead.

So yeah, I'm sure if you are some x86/DirectX,Windows,Desktop PC company you 'hates teh Cell' because you don't have a fucking clue how to handle anything outside that sad and narrow little world. Sucks to be them.

 

How could Microsoft have botched the 360 so badly? (0)

Anonymous Coward | about 4 years ago | (#33868220)

Given all the now-obvious lies spread about the relative performance of the PS3 and Xbox 360, you really have to wonder how Microsoft could have gotten the 360's graphics hardware so terribly wrong that it can't compete with the PS3.

The PS2 easily put the Dreamcast to shame graphically, but it was nothing like what the PS3 has done to the 360 this gen. Last gen the PS2, GameCube, and Xbox were all putting out roughly the same number of polys in high-end games (10-20 million or so) and ran at similar resolutions, with a few exceptions on each platform. Each console did have areas where it excelled - the GameCube's quick-seeking drive and fast RAM, the PS2's insanely fast eDRAM and massive floating-point power, and the Xbox's strength at multipass rendering.

You really have to wonder what the hell Microsoft was thinking. First they gimp the 360 with the 6GB disc format, making it the only console in history to have less space than the previous gen. Then they gimp it with eDRAM too small to fit a standard 4xAA 720p frame buffer, making the machine a nightmare for developers. At least they dumped the horribly outclassed x86 chip for an IBM rush job that slapped a third core onto one of their existing designs. Still nothing that could compete in any way with the PS3's Cell chip.

Even a company with no console hardware design competence had to know they were dooming the console to be outclassed by the PS3.

And look at what the 360's graphical legacy turned out to be: the hilariously fake Epic Gears of War marketing shots; faked side-by-side multiplatform comparisons by fanboys messing with the video settings on the PS3 version, or playing games with image compression so the PS3 versions look more jaggy and less detailed; and now Xbox fans pretending that fake marketing promo shots from high-end PC games will look just like that on the 360. The 360 is the first console in history without a single exclusive game of any note graphically, let alone one remotely close to PS3 graphics levels.

Most likely Microsoft has realized that even after blowing billions they can't compete with Sony's dominance in console graphics hardware, and has instead turned its attention to pulling a Wii-type move by slapping those EyeToy-style motion controls on the old 360 hardware.

Animats Just Another Xbox Fanboy (0)

Anonymous Coward | about 4 years ago | (#33867986)

The clown has been spewing the same copy-and-paste garbage on this site (and most likely everywhere else he frequents on the Net).

Hush (0)

Anonymous Coward | about 4 years ago | (#33868236)

Now all that's needed is someone to copy-paste the parent post into the relevant Wikipedia article;
then his words will be pure gold. As evidenced by his +5 vs your +3, you stand no chance anyhow.

Re:With more memory per CPU, it might not suck (1)

shadowofwind (1209890) | about 4 years ago | (#33868164)

The bandwidth in and out of those tiny SPU memories is great - much better than between main memory and cache on an x86 processor, or generally between cache and processor on a GPU. I don't know what anyone needs that for, though.

Re:With more memory per CPU, it might not suck (2, Interesting)

CronoCloud (590650) | about 4 years ago | (#33868234)

The SPEs aren't full CPUs; they're essentially enhanced versions of the PS2's VUs.

Given that model, it has to be programmed like a DSP - very little state, processing works on data streams.

Yep, stream data, just like on the PS2.

For games, this sucks. No CPU has enough memory for a full frame, or for the geometry, or a level map.

You're not supposed to keep a full frame or map in there, you're supposed to stream it in and out on the fly, as the Kami intended, just like on the PS2.
"Fat Pipes (bandwidth), small pans (VU/SPE RAM)" http://arstechnica.com/old/content/2000/04/ps2vspc.ars/1 [arstechnica.com]

In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O

The Cell isn't MIPS, it's PPC; the PS2 (and PS1) were the MIPS machines. The SPEs are supposed to handle things like audio and networking - that's their job. Apparently you can also assign an SPE to tasks like very fast bzip2 decompression.

You're wasting your breath on an xbox fanboy (0)

Anonymous Coward | about 4 years ago | (#33868832)

The guy, just like so many other Xbox and PC gamer fans, is regurgitating every little bit of crap he ever heard, hoping to convince people that what they see with their own eyes isn't true. The PS3's Cell+RSX combo has destroyed the Xbox 360 graphically this gen.

It would be no big deal if fanboys like the OP were spewing this type of inane techno-babble back in 2005/6 when the PS3 was not yet released or just released. But in 2010. That's just sad and pathetic.

Xbox and PC gamer fans filled their heads with garbage from sites like the Beyond3d forums because it told them what they desperately wanted to believe: that it was all lies from Sony and 'teh Cell hype'. Fine, no big deal. But as graphical masterpiece after graphical masterpiece came out on the PS3, destroying anything on the Xbox 360, they circled the wagons and kept clinging to the crap they'd filled their heads with from the Beyond3d forums instead of doing the rational thing.

I remember the so-called 'experts' (aka desktop PC programmers) going on and on about how the Killzone 2 footage was simply 'impossible' for the PS3 to ever run. They spouted post after post foaming at the mouth about how it was all 'Sony lies'.

And then:

http://generationdreamteam.free.fr/afrika/killzone2/KillZone2compa.jpg [generation...am.free.fr]

The real time PS3 Killzone 2 demo came out. It was like there was going to be mass Xbox and PC gamer fan suicides over that. All the lies and bullshit they kept telling themselves about the PS3, Cell, 'hard teh program', 'teh Xbox 360 GPU is better than teh PS3's' was made a mockery.

And this happened with PS3 exclusive after exclusive.

Each time, the Xbox and PC fans would rush back to the Beyond3d forums to get their talking points about why the latest PS3 game didn't really look as good as everyone could see with their own eyes, and how 'teh 360 could easily handle those graphics but devs just don't want to'.

And now it is 2010 and those same Xbox and PC gamer fans are still sitting around in forums spewing the same bullshit and lies. So sad and pathetic. I remember the Dreamcast fans being bad, but they mostly gave up trying to convince the world that the Dreamcast could keep up with the PS2 after a year or so.

Re:With more memory per CPU, it might not suck (1)

generationxyu (630468) | about 4 years ago | (#33868938)

Please don't confuse the SPUs (the eight coprocessors on the Cell die) with the PPU (the main CPU core). The PPU is also part of the Cell, so don't call the SPUs "Cell CPUs". There is also no MIPS core -- the PPU is a 3.2GHz PPC core with two hardware threads. The SPUs also run at 3.2GHz, but are not considered "real" CPUs since they can't bootstrap themselves, they have to be given tasks from the PPU. SPU programming forces a model on you as a developer -- modularize your tasks with as few synchronization points as possible and treat the SPUs like a thread pool. What's the problem here? This is a good model even if you're not limited to the SPUs. Developers who move more and more tasks to the SPUs will find themselves in a much better position next generation when parallelization is more massive, regardless of whether the Cell or something like it is involved.
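The model the parent describes — a main core queuing up self-contained tasks with as few synchronization points as possible, and a fixed pool of workers draining the queue — looks roughly like this. This is a generic Python sketch of the shape only; real SPU code would be C with DMA transfers, and all names here are invented:

```python
import queue
import threading

def run_job_system(tasks, workers=6):
    """PPU/SPU-style job system sketch: the main thread (the "PPU") queues
    self-contained tasks; a fixed pool of workers (standing in for SPUs)
    drains the queue. The only synchronization points are the two queues
    and the final join."""
    jobs = queue.Queue()
    results = queue.Queue()

    def worker():
        while True:
            task = jobs.get()
            if task is None:        # sentinel: no more work for this worker
                break
            results.put(task())     # run the task, publish its result

    pool = [threading.Thread(target=worker) for _ in range(workers)]
    for t in pool:
        t.start()
    for task in tasks:              # "PPU" hands out work
        jobs.put(task)
    for _ in pool:                  # one sentinel per worker
        jobs.put(None)
    for t in pool:
        t.join()
    return [results.get() for _ in range(len(tasks))]

totals = run_job_system([lambda n=n: n * n for n in range(8)])
```

The point of the parent's advice is that code structured this way doesn't care whether the workers are SPUs, CPU cores, or something else entirely next generation.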

Re:With more memory per CPU, it might not suck (0)

Anonymous Coward | about 4 years ago | (#33869794)

How is this getting modded up? The guy clearly doesn't understand the architecture of the PS3... he thinks there's a MIPS CPU in there, for christ's sake.

IBM - proven in this market space (2, Interesting)

KingFrog (1888802) | about 4 years ago | (#33866516)

I would not want to bet against IBM in this market space. Their Cell chip, an asymmetric multi-core CPU architecture, seemed bizarre when announced but has proven quite good for these workloads. If IBM is looking to leverage its regular POWER chipset for the console market, it will probably build some screamers. Cell and POWER both have Unix and Linux ports, so the capability seems trivial to provide. Whether vendors will want you using their hardware that way is another matter entirely: the chief reason console games cost so much is that for every copy sold, the developer pays the console hardware manufacturer a licensing fee, unlike the PC arena, where the architecture is published and you develop for it at effectively no additional cost.

Re:IBM - proven in this market space (0, Troll)

TheRaven64 (641858) | about 4 years ago | (#33867318)

If IBM is looking to leverage their regular POWER chipset for the console market, they will probably build some screamers with them

All of the current generation consoles use IBM chips. The GameCube and Wii both use chips from IBM's low-end 32-bit PowerPC line (the 750-derived Gekko and Broadway). The XBox 360 uses a custom 3-core in-order PowerPC chip. The PS3 uses Cell (a PowerPC core + 7 SPUs - the PS3 gets the dies where one of the 8 SPUs failed testing; the ones where all 8 work go into blades and supercomputers).

They could actually try to sell the Cell (4, Interesting)

dbIII (701233) | about 4 years ago | (#33866522)

A while back I was looking for one or two Cell-based machines as development boxes for in-house geophysical software - basically to see if the platform was worth moving to. The three-week process between contacting what appeared to be the only vendor of Cell-based workstations and getting a price for an entry-level machine was frustrating. It involved daily calls to a slimy bastard who appeared to just want to waste time trying to become my friend until he had carefully finished weighing my company's wallet.
In the end the time window had come and gone (the developers got bored and gave up on the idea of using Cell) before I could get even a hint of the price, but I kept going for the sake of future projects. The price for one workstation with one processor was about the same as six of our cluster nodes. You would need some sort of black-ops budget where any accountants coming close are shot on sight before paying that sort of price. An entry-level machine not much different from a PlayStation with more memory cost a truly insane and unjustifiable amount.

Re:They could actually try to sell the Cell (0)

Anonymous Coward | about 4 years ago | (#33866578)

Yeah, it would be nice if Cell could actually be sold outside PS3s, and ARM designs outside cell phones.
It's a shame to see such interesting hardware going to waste on closed systems.

Re:They could actually try to sell the Cell (3, Informative)

Johnno74 (252399) | about 4 years ago | (#33866666)

Can't buy ARM outside a cellphone? Are you kidding?

Check this out - this is just one I found with about 5 seconds

http://www.makershed.com/ProductDetails.asp?ProductCode=MKND01 [makershed.com]

There are dozens of ARM boards out there suitable for DIY/embedded systems

Re:They could actually try to sell the Cell (0)

Anonymous Coward | about 4 years ago | (#33866964)

www.beagleboard.org
For $150 you get a 720MHz, low-power, fanless, efficient ARM design (the processor runs 2 instructions per cycle) implementing completely open hardware, with a dedicated GPU (up to 10 million polygons per second - decent), a DSP for rendering 720p HD video, and a crapload of other features (http://beagleboard.org/hardware).

That is just one of many ARM options. Were you being sarcastic?

Re:They could actually try to sell the Cell (1)

oPless (63249) | about 4 years ago | (#33867122)

You're kidding right?

There's lots of ARM dev kits out there, chumby hacker board, netduino, cortex, etc.

Not forgetting where ARM came from (ARM originally stood for Acorn RISC Machine), there's an old list here:

http://productsdb.riscos.com/comp/curr.htm [riscos.com]

Then there is an desktop operating system for the above called RISCOS http://www.riscos.com/ [riscos.com]

Of course a lot of these pages are rather old now, and you'll find lots of broken links ... which can only tell you one thing....

ARM is an old CPU, only about seven years younger than the 8086, which has found its niche in embedded systems rather than desktops.

Re:They could actually try to sell the Cell (1)

LWATCDR (28044) | about 4 years ago | (#33868912)

Really? Well, here you go:
http://beagleboard.org/hardware [beagleboard.org]
http://gumstix.com/ [gumstix.com]
There are a lot more but beagleboard is the closest I have seen to a mini ITX board.
Just plug in a keyboard, mouse and monitor and you are good to go.

Re:They could actually try to sell the Cell (0)

Anonymous Coward | about 4 years ago | (#33866648)

You shoulda just got a Toshiba Qosmio G50 or F40 then

Re:They could actually try to sell the Cell (1)

statusbar (314703) | about 4 years ago | (#33866876)

That is true. I don't know about now, but a few years ago you couldn't even get a pinout for the Cell processor. You had to show both IBM and Sony your business plan, and your market could not impact Sony's. IBM has some cookie-cutter circuit boards with a Cell on them that they want to sell for big bucks, along with a big down payment and minimum quantities. The reality is that the Cell processor is not THAT great - good, but not great - and it requires a big change in how you factor your software design. While it sounds exciting to have all eight AltiVec-style processors, you hit memory bandwidth limitations very quickly unless your algorithm has small input, output, and intermediate data sizes.

--jeffk++

Re:They could actually try to sell the Cell (3, Informative)

TonyMillion (545370) | about 4 years ago | (#33866980)

Odd - when we were working with Cell we went straight to Matrix Vision and they LOANED us the hardware for about a year. Nothing sleazy at all. IBM also loaned us a server, as did Sony (a beautiful rack-mount job which will never see the light of day).

http://www.matrix-vision.com/products/cell.php?lang=en [matrix-vision.com]

Bottom line: the PPC part of the Cell is rubbish - terrible IO and generally 'weak' by today's standards. The SPEs are great, but there isn't enough memory on them (256K) for the algorithms + tables we needed to process the data.

In the end, optimizing for Intel with SSE3 and making the algorithms multi-core capable gave a better pain-to-performance ratio than working on the Cell, which would have required all the additional work of managing DMA to/from the meagre memory on the SPE.

Oh Joy! A x86 Hack Babbling About Cell (0)

Anonymous Coward | about 4 years ago | (#33867754)

Sigh.

Never fails. Out come the x86 hacks babbling about how they'd rather be working on their archaic piece of shit chip architecture.

What a fucking joke you are.

Re:Oh Joy! A x86 Hack Babbling About Cell (0)

Anonymous Coward | about 4 years ago | (#33869288)

Never fails. Out come the PPC trolls, still pissed because there is no future in the PPC architecture since IBM stopped caring. This pittance from IBM was just to shut you up for now. Don't you know there's no carrot, only stick?

What a poor, sad boy you are.

Re:They could actually try to sell the Cell (5, Interesting)

wowbagger (69688) | about 4 years ago | (#33867796)

I can go one better: I do signal processing for a living - chewing on multi-hundred-megasample-per-second streams of data in real time. The Cell looked like a perfect fit. We were looking at thousands of units per year. Contacted IBM - sorry, not enough zeros on that number for them to sell us the chips. OK, are there any vendors targeting the uTCA form factor (which the telecomms folks are all over, so they would not have been targeting just us)? Nope, just large blades for mainframes.

I assert that IBM doesn't want to be in the chip business - at least, not "selling chips to anybody else". They don't mind making chips for their own use, but they really don't have the infrastructure to sell to anybody else.

Sony and Toshiba don't want to be in the high-end CPU market, they want to be in the mass-market stuff.

Had IBM licensed the Cell design to somebody like Freescale, they might have gone somewhere.

Sorry, but I RTFA - and what I came away with was "We will continue to support Sony for as long as Sony wants to make PS3's". I saw nothing that really said "We are going to be going someplace else with this."

They're going to continue supplying? (0)

Anonymous Coward | about 4 years ago | (#33866534)

IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. 'We want to stay in the business, we intend to stay in the business,' he said. IBM confirmed in a statement that it continues to manufacture the Cell processor for use by Sony in its PlayStation 3.

Is that surprising? Why would they stop? All three consoles currently use a POWER-related chip, it's not like they're going to just throw all those big stacks of money away for no reason. Was someone speculating that they were going to pull out or something? Why is this news?

The problem with Cell... (2, Insightful)

Entropius (188861) | about 4 years ago | (#33866952)

... is that it lies in between ordinary x86-type multicore processors and CUDA/GPGPU, and there's not much room in between.

Re:The problem with Cell... (0, Troll)

TheRaven64 (641858) | about 4 years ago | (#33867356)

It's more of a triangle. In one corner, you have general-purpose CPUs, optimised for branch-heavy code with lots of locality of reference. In another, you have streaming, often SIMD, processors optimised for non-branching code, with high throughput, such as GPUs and DSPs. In the third corner, you have specialised silicon dedicated to specific algorithms (e.g. building blocks for encryption algorithms or video CODECs).

Cell is along one side of this. It isn't particularly throughput-focussed, and it is optimised for locality of reference. You have to DMA in a block of memory (under 256KB), process it, and then store it. It is not optimised for branching. It is heavily SIMD. The real problem with it is that there aren't that many algorithms that benefit from this combination of CPU features.
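That DMA-in / process / store cycle can be mimicked in ordinary code. In this illustrative Python sketch (not actual SPU code; the function and names are invented), the "local store" is just a slice, but the shape of the loop is the same:

```python
def stream_process(data, kernel, local_store=256 * 1024):
    """Cell-style streaming sketch: pull in a block that fits the (sub-256KB)
    local store, run a kernel on it, and write the result back out.
    The slicing stands in for DMA; real SPU code would double-buffer so the
    next transfer overlaps the current computation."""
    out = bytearray(len(data))
    for off in range(0, len(data), local_store):
        block = data[off:off + local_store]        # "DMA in"
        out[off:off + len(block)] = kernel(block)  # process locally, "DMA out"
    return bytes(out)

# Toy kernel: invert every byte of a 1MB buffer, 256KB at a time.
result = stream_process(bytes(1024 * 1024), lambda b: bytes(x ^ 0xFF for x in b))
```

Algorithms that fit this loop (the kernel touches only its own block) fly on Cell; algorithms that need random access to the whole buffer are the painful ones.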

The Cell is Affordable and has Floating Point (0)

Anonymous Coward | about 4 years ago | (#33867068)

The Cell is an affordable solution, and the SPEs could be given MORE capabilities...

Comparable (perhaps more advanced) processors from Tilera and Cavium (32, 64 and up to 100 cores) are very expensive. IBM should create a road map for this processor: it has floating-point capability, something those other processors lack.

"IBM's Plans For the Cell Processor " (0, Offtopic)

blind biker (1066130) | about 4 years ago | (#33867112)

I know Slashdot is the enemy of good writing practices, so this post will be modded down to hell, but I must point out that lately the capitalization of Slashdot submission titles has gotten completely out of hand. The rule is simple: if you want to capitalize your headlines, you capitalize every word except
- prepositions ("of", "to", "in", "for", "with" and "on")
- articles ("the", "a" and "an")
- and some other obvious exceptions.

On Slashdot, the editors are so ignorant that they usually capitalize each and every word. But this title, "IBM's Plans For the Cell Processor", shows that capitalizing every word is not even a consistent policy!

mbt shoes (1)

aotian (1915400) | about 4 years ago | (#33867126)

Catalan star was nhl jerseys [nfljerseyspub.com] forced to put on the international nfl jerseys [nfljerseyspub.com] team mate and Carles Puyol, nba jerseys [nfljerseyspub.com] who seized the midfield, and mlb jerseys [nfljerseyspub.com] successfully pulled his shirt in front of soccer jerseys [nfljerseyspub.com] a crowd of fans, the first shirt. cheap nfl jerseys [jerseysol.com] Liverpool goalkeeper Reina, nfl jerseys [nfljerseyspub.com] who served as master of ceremonies at nhl jerseys [nfljerseyspub.com] the microphone of the occasion, announced the nba jerseys [nfljerseyspub.com] "Barcelona of the Future" Puyol and Pique hit. mlb jerseys [nfljerseyspub.com] However, Fabregas seems unwilling to long-term soccer jerseys [nfljerseyspub.com] preservation, it seems likely to cheap nfl jerseys [jerseysol.com] pose a clear explanation of theMBT shoes [buy2shoes.com] UAE’s shirt. Previously, cheap mbt shoes [buy2shoes.com] Fabregas gesture, announced wholesale ugg boots [buy2shoes.com] yesterday that he was proud as a Arsenal player. wholesale Christian Louboutin shoes [cl-shoescom.com] The midfielder whom seem to havesupply cheap Christian Louboutin shoes [cl-shoescom.com] his ’family’xxhhjjzz

Re:mbt shoes (1)

qmaqdk (522323) | about 4 years ago | (#33867620)

I think this guy is suggesting we buy some of his shoes, but I'm not quite sure.

Re:mbt shoes (0)

Anonymous Coward | about 4 years ago | (#33869596)

Spam Spam Spam yep it's spam harmed my computer [nfljerseyspub.com] [nfljerseyspub.com] spammer sucks, has poor-quality items [nfljerseyspub.com] [nfljerseyspub.com] is a scam site quit spamming here [nfljerseyspub.com] [nfljerseyspub.com] these people are crooks stole my money [nfljerseyspub.com] [nfljerseyspub.com] want my money back

complained to the supervisor [nfljerseyspub.com] [nfljerseyspub.com] would NOT recommend they were difficult to deal with [jerseysol.com] [jerseysol.com] the clothes were cheaply-made junk do not buy! [nfljerseyspub.com] [nfljerseyspub.com] spammed my mailbox again a bad deal [nfljerseyspub.com] [nfljerseyspub.com] do not buy from them!!!!! probably stole my credit card info [nfljerseyspub.com]

Hahahaha you people are stupid [nfljerseyspub.com] "Why did they steal my money?" I asked to speak to a supervisor. items were late [nfljerseyspub.com] [nfljerseyspub.com] Would NOT recommend NO WAY JOSE [nfljerseyspub.com] [nfljerseyspub.com] goth rules shitty service [jerseysol.com] [jerseysol.com] worship the dark oneRipoff!!! [buy2shoes.com] [buy2shoes.com] Satan Satan Satan cheap mbt shoes [buy2shoes.com] [buy2shoes.com] Midget fucking Cheap imitations [buy2shoes.com] [buy2shoes.com] Felching contest tomorrow Puked on his shoes [cl-shoescom.com] [cl-shoescom.com] RipoffGo fuck yourself [cl-shoescom.com] [cl-shoescom.com] Got the point, stupid?

--
http://www.mbt-shoes-suck.com

Perfect Cell (1)

Meneth (872868) | about 4 years ago | (#33867504)

I wonder what it's gonna have to absorb to evolve into the Perfect Cell. :)

what would be cool (2, Interesting)

AVryhof (142320) | about 4 years ago | (#33867610)

What would be a pretty cool chip would be an 8-core chip with 4 x86_64 cores, two graphics cores, and two Cell cores. (perhaps IBM + AMD working together)

After that, build a custom Linux with MeeGo as the front end / launcher. It would be cool if game console makers embraced open source for everything up to launching the games... and if they don't want their SDK open source, that's fine - just make the operating system able to launch the games, then get out of the way. Run the OS on two cores (for better multimedia functionality, ebook reading, etc.) and use the rest of the cores (2 x86_64, 2 graphics and 2 Cell) for gaming.

As for the other hardware: composite, component, HDMI, VGA, WiFi, Ethernet, and a headphone jack (maybe Bluetooth for wireless controllers and headsets), plus Blu-ray, a card reader, and USB.

This is all off the top of my head, and would be a pretty cool gaming console, which would truly capture the home entertainment medium and make most people looking for gadgets, consoles, or HTPCs drool appropriately.

Re:what would be cool (1)

drinkypoo (153816) | about 4 years ago | (#33868016)

What would be a pretty cool chip would be an 8-core chip with 4 x86_64 cores, two graphics cores, and two Cell cores. (perhaps IBM + AMD working together)

This is a bad idea because it's precisely the kind of ignorant crap that HyperTransport is supposed to eliminate. Instead of cramming a bunch of crap into one package, you sell multiple packages so that people can customize their layout. Ideally you'd have the x86_64 cores, Cells, and graphics cores all communicating via HT links, and then it wouldn't matter where they are physically located... but trying to put all that into one package would be a TDP nightmare at this point.

YUO FAIL IT? (-1, Redundant)

Anonymous Coward | about 4 years ago | (#33867928)

website. Mr. de the project as A marketing surveys irc.easynews.com do, or indeed what EVERY CHANCE I gOT fastest-growing GAY