The Technical Difficulty In Porting a PS3 Game To the PS4

Soulskill posted about 2 months ago | from the more-than-you-bargained-for dept.

PlayStation (Games) 152

An anonymous reader writes "The Last of Us was one of the last major projects for the PlayStation 3. The code optimization done by development studio Naughty Dog was a real technical achievement — making graphics look modern and impressive on a 7-year-old piece of hardware. Now, they're in the process of porting it to the much more capable PS4, which will end up being a technical accomplishment in its own right. Creative director Neil Druckmann said, 'Just getting an image onscreen, even an inferior one with the shadows broken, lighting broken and with it crashing every 30 seconds ... that took a long time. These engineers are some of the best in the industry and they optimized the game so much for the PS3's SPUs specifically. It was optimized on a binary level, but after shifting those things over [to PS4] you have to go back to the high level, make sure the [game] systems are intact, and optimize it again. I can't describe how difficult a task that is. And once it's running well, you're running the [versions] side by side to make sure you didn't screw something up in the process, like physics being slightly off, which throws the game off, or lighting being shifted and all of a sudden it's a drastically different look. That's not 'improved' any more; that's different. We want to stay faithful while being better.'"


152 comments

Shitty code (-1)

Anonymous Coward | about 2 months ago | (#47030033)

If the code was good it wouldn't be difficult to port. It's all their own fault; besides, the PS4 is more or less an x86 machine and not exactly exotic hardware.

Re:Shitty code (4, Interesting)

DreadPiratePizz (803402) | about 2 months ago | (#47030045)

You realize the PS3's hardware WAS exotic right? That's exactly why it's hard! Write code optimized for multiple SPE units, and see how well you can get it to run on x86.

Re:Shitty code (-1)

Anonymous Coward | about 2 months ago | (#47030085)

That's easy! I wouldn't try to run SPE code on x86, but x86 code on x86. The "optimization" shouldn't have affected the game or its portability, and that's why it's their own damn fault.

PS3 Optimization: Parallelizing code 7 ways (4, Informative)

perpenso (1613749) | about 2 months ago | (#47030119)

That's easy! I wouldn't try to run SPE code on x86, but x86 code on x86. The "optimization" shouldn't have affected the game or its portability, and that's why it's their own damn fault.

Optimized PS3 code is not what you think it is. It's not taking some general C code and rewriting it a bit to be friendlier to the underlying CPU architecture, or rewriting it completely in assembly. The PS3's Cell processor is, in simplified terms, a general purpose CPU and six special purpose coprocessors. So optimization is really figuring out how to pull major pieces of code out of the general purpose CPU and move them to the appropriate specialized coprocessor, and then add the control code necessary to coordinate the two. Or to put it even more simply, optimizing for the PS3's Cell processor is really an exercise in parallelizing code 7 ways.
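For a concrete picture of what that "control code" looks like, here is a minimal sketch using plain C++11 threads as stand-ins for the SPUs. This is not the Cell SDK and not Naughty Dog's job system, just the dispatch shape described above: a main "PPU" thread carving frame work into jobs, and six workers draining a queue.

#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct JobQueue {
    std::queue<std::function<void()>> jobs;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    void push(std::function<void()> j) {
        { std::lock_guard<std::mutex> lk(m); jobs.push(std::move(j)); }
        cv.notify_one();
    }
    void worker() {                       // one of these per "SPU"
        for (;;) {
            std::function<void()> j;
            {
                std::unique_lock<std::mutex> lk(m);
                cv.wait(lk, [&]{ return done || !jobs.empty(); });
                if (jobs.empty()) return; // finished and drained
                j = std::move(jobs.front()); jobs.pop();
            }
            j();                          // "coprocessor" does the work
        }
    }
    void finish() {
        { std::lock_guard<std::mutex> lk(m); done = true; }
        cv.notify_all();
    }
};

int main() {
    JobQueue q;
    std::vector<std::thread> spus;
    for (int i = 0; i < 6; ++i)           // six SPU-like workers
        spus.emplace_back([&]{ q.worker(); });

    // The "PPU" splits the frame's work (animation, culling, particles,
    // audio mixing, ...) into jobs and feeds them to the workers.
    for (int batch = 0; batch < 100; ++batch)
        q.push([batch]{ /* process one batch of game data */ });

    q.finish();
    for (auto& t : spus) t.join();
}

The hard part on the real hardware is everything this sketch leaves out: deciding which systems can be cut apart into jobs at all, and moving their data in and out of each SPU's tiny local store.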

Re:PS3 Optimization: Parallelizing code 7 ways (-1, Troll)

Anonymous Coward | about 2 months ago | (#47030167)

Sounds interesting! If it weren't lazy and shitty coding, you could simply remove or change the optimization to fit the new architecture and the rest would still be the same code.

Re:PS3 Optimization: Parallelizing code 7 ways (3, Informative)

Anonymous Coward | about 2 months ago | (#47030371)

Yeah, sure. But no.
Optimizing code for the Cell means rewriting the code so it operates on small chunks of data, with data transferred explicitly between the cores.
The SPUs have fast communication between some of them, which means that if you run the wrong kernel on the wrong SPU the whole thing slows down dramatically.
This, combined with all the explicit DMA transfers you have to do, makes the code extremely architecture-specific.
Essentially the only code that can stay "the same" is the code running on the MIPS core, which, if they really optimized it, only controls which kernel is executed on which SPU.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

Anonymous Coward | about 2 months ago | (#47030691)

What a horribly STUPID idea.
You suggest that they should have written all their code in some generic x86 format.
Well guess what, that kind of code runs like SHIT on a PS3. You'd be able to make games that look like PC games from 15 years ago. If you want better stuff you'll need to actually write code for the architecture at hand.
If you write the code for the actual machine then you invariably end up with fundamental adaptations to the engine, in how it handles data, how interrupts are handled, how bandwidth is used, how the coprocessors are used, etc.
Making a game like this is like defining a factory process. Only the PS3 is not an x86 factory. It has some of the same modules but everything is specced differently.
With games it's not only important IF things get done. You also need to keep a tight schedule on WHEN things get done. And this happens to be something heavily influenced by the architecture and how you use it. Controlling this means the code will have to be pretty specific to the hardware.
The generic bullshit you talk about can only exist at a higher abstraction level. It is still there and portable, but it plays a much less important role on consoles.

In short, consoles have a different set of requirements to get anything decent out of them. They are designed to be optimized the way Naughty Dog has done it, and the unportability this leads to is something the industry takes as a fact of life.

Re:PS3 Optimization: Parallelizing code 7 ways (-1)

Anonymous Coward | about 2 months ago | (#47031945)

Sounds interesting! If it weren't lazy and shitty coding, you could simply remove or change the optimization to fit the new architecture and the rest would still be the same code.

The only shitty part I detect is the one between your ears. Your shit-filled brain doesn't seem to comprehend the complexity of coding for the PS3's SPUs.

Re:PS3 Optimization: Parallelizing code 7 ways (5, Interesting)

Animats (122034) | about 2 months ago | (#47030203)

The PS3's Cell processor is, in simplified terms, a general purpose CPU and six special purpose coprocessors.

The "coprocessors" are decent general-purpose computers. The problem is that they only have 128K of RAM each. They can access main memory, with high latency but good bandwidth, through a DMA mechanism. (There's also a relatively conventional NVidia GPU on the back end, so the GPU part of the job isn't the big difference. This isn't about shaders.)

That 128K of RAM is too small for a frame. Too small for a level. Too small for any big part of game state. PS3 programming thus consists in turning a problem into some form of streaming process, where data is pumped into Cell processors, processed a bit, and pumped back out. This is a signal processing architecture. It's great for audio. Sucks for everything else.
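To make the streaming idea concrete, here is a rough sketch of the double-buffered slice processing that pattern implies. It's plain C++, with memcpy standing in for the SPU's asynchronous DMA; the real thing would use the Cell SDK's DMA primitives and tag waits, none of which are shown here.

#include <algorithm>
#include <cstring>
#include <utility>
#include <vector>

constexpr size_t kSlice = 4 * 1024;          // floats; small enough for local store

void process(float* slice, size_t n) {       // the actual "kernel"
    for (size_t i = 0; i < n; ++i) slice[i] *= 0.5f;
}

void stream_job(float* main_mem, size_t count) {
    float bufA[kSlice], bufB[kSlice];        // double buffer in "local store"
    float* in = bufA;
    float* next = bufB;

    size_t first = std::min(kSlice, count);
    std::memcpy(in, main_mem, first * sizeof(float));         // "DMA in" slice 0

    for (size_t base = 0; base < count; ) {
        size_t n = std::min(kSlice, count - base);
        size_t next_base = base + n;
        size_t next_n = next_base < count ? std::min(kSlice, count - next_base) : 0;
        if (next_n)                                            // start fetching the next slice
            std::memcpy(next, main_mem + next_base, next_n * sizeof(float));

        process(in, n);                                        // crunch the current slice
        std::memcpy(main_mem + base, in, n * sizeof(float));   // "DMA out" the result

        std::swap(in, next);
        base = next_base;
    }
}

int main() {
    std::vector<float> data(10000, 2.0f);
    stream_job(data.data(), data.size());    // every element is now 1.0f
}

On a real SPU the point of the double buffer is that the next transfer overlaps the current computation; with synchronous memcpy that overlap is only implied, but the data-slicing shape is the same.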

The PS3 was a "build it and they will come" architecture. It was cheap to make, but large numbers of smart people spent years trying to figure out how to use the thing properly. (Sony basically gutted their US R&D group to beat on that problem.) In practice, a lot of games did most of their work in the main CPU (a MIPS machine) and the GPU, using the Cell processors only for audio, fire and explosions, particle systems, and other tasks that didn't have a lot of interconnected state.

Re:PS3 Optimization: Parallelizing code 7 ways (4, Informative)

JavaBear (9872) | about 2 months ago | (#47030361)

"This isn't about shaders."

I'm recalling a story about the production of Uncharted 2, also by Naughty Dog, mentioning that this is definitely about the shaders and other graphical effects.

The game engine is not just general gaming code, pulled apart and optimized; it was written for the specifics of the Cell CPU and the PS3 architecture, then optimized at a byte level. One of the tricks Naughty Dog learned was to leverage the 7 active SPEs in the Cell CPU to assist the GPU in rendering the scenes.
They have the SPEs do some of the tasks the ageing GPU can't, as well as physics.
Besides, the GP CPU in the Cell is not an x86 core; it's another relic, a PowerPC core, and not a particularly fast one at that. IIRC its primary job is to manage the 7 SPEs.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

CronoCloud (590650) | about 2 months ago | (#47031875)

a PowerPC core, and not a particularly fast one at that

It's 3.2 GHz, hyperthreaded, with an AltiVec unit. Probably more capable than the CPUs in all those WinXP machines still running. Heck, it might even beat the Athlon X2 240 (2.8GHz) I have in this machine.

Re:PS3 Optimization: Parallelizing code 7 ways (4, Informative)

rsmith-mac (639075) | about 2 months ago | (#47030497)

In practice, a lot of games did most of their work in the main CPU (a MIPS machine) and the GPU

Minor correction: Cell's main CPU - called the Power Processing Element [wikipedia.org] - was a PowerPC processor, not MIPS.

Re:PS3 Optimization: Parallelizing code 7 ways (2)

Narishma (822073) | about 2 months ago | (#47030745)

Another correction: the SPEs have 256KiB of local storage, not 128KiB.

Re:PS3 Optimization: Parallelizing code 7 ways (5, Funny)

Anonymous Coward | about 2 months ago | (#47031337)

Still too little. 640k would have been enough for everyone.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

MightyMartian (840721) | about 2 months ago | (#47031677)

I'll come in again.

Re:PS3 Optimization: Parallelizing code 7 ways (0)

Anonymous Coward | about 2 months ago | (#47030513)

This whole thing sounds pretty much like what HSA should be able to support on PCs, including concurrency and stream pipelines. Not that you'd design your code like this, but you should be able to run it. The question is whether one can pull off the same thing on the PS4 using whatever APIs are available.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

beelsebob (529313) | about 2 months ago | (#47031121)

The problem with such an approach is that while you can indeed simulate it, simulating it will be inherently slower than running it on the Cell in the first place. While the PS4 does have one x86 core for each SPU on the PS3, those cores run at half the clock speed, and for very specific tasks the PS3's SPUs had great IPC, so getting a PS4 core to do the same thing as a PS3 SPU will simply be slower.

Re: PS3 Optimization: Parallelizing code 7 ways (1)

loufoque (1400831) | about 2 months ago | (#47030643)

It's not fundamentally different from programming DSPs, except for the floating point. The Cell isn't that exotic.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

CronoCloud (590650) | about 2 months ago | (#47031853)

In practice, a lot of games did most of their work in the main CPU (a MIPS machine)

The PS3's Cell isn't MIPS, it's PPC.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

fuzzyfuzzyfungus (1223518) | about 2 months ago | (#47030229)

It doesn't help that The Last of Us (for reasons that probably have to do with being published by Sony Computer Entertainment; but may go further, I don't know) was 100% PS3 exclusive, and apparently not built with the expectation that portability would be a consideration.

There are plenty of ways to improve portability, either through better software design (which game development and release timelines probably don't allow) or through heavier use of licensed engines and middleware that do the abstraction for you, at a cost in money and potentially quality; but you are less likely to use them if time is short and 'portable' isn't on the list of objectives.

Had this been a cross platform title, they presumably would have just thrown the PS3 version away and worked from pretty much anything else.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

Anonymous Coward | about 2 months ago | (#47030359)

Naughty Dog is a first-party Sony developer, so 'exclusive' here probably really means platform-exclusive.

So I'd assume at the time of development portability never even came up.

Re:PS3 Optimization: Parallelizing code 7 ways (1)

Anonymous Coward | about 2 months ago | (#47031211)

Their code is pretty much a job queue, where some of the code is an SPU job manager and the dispatchers are SPU jobs. They have two kinds of code, for the PPU and the SPU. Now they pretty much need to write for one processor, the x86. I imagine they are bullshitting us if they find it harder to optimize for a homogeneous kind of hardware, and we know very well that most of their code is portable C++ and their scripting is some variant of LISP.

Please, I've thought about this and Naughty Dog are just bullshitting us on the effort.

Re:PS3 Optimization: Parallelizing code 7 ways (0)

Anonymous Coward | about 2 months ago | (#47031997)

You're a dumbass if you think ND is bullshitting people on the difficulty of porting the game. But then, who would believe a worm like you that talks shit about something he can't even comprehend because he's so technically illiterate when it comes to programming.

Re:PS3 Optimization: Parallelizing code 7 ways (0)

Anonymous Coward | about 2 months ago | (#47032567)

Yawn.. You're a dumbass who hasn't read anything ever written on the Cell, or by Naughty Dog. So I'll start believing your words when you have something technical to talk about.

Do you even know what reading is?

Re:Shitty code (0)

Anonymous Coward | about 2 months ago | (#47031253)

You don't know anything about programming, do you? Take 50MB of assembly code and try to port it to another CPU architecture. Let me know how it goes.

Re:Shitty code (1)

Anonymous Coward | about 2 months ago | (#47031313)

If you wrote 50MB of assembly code without writing any development/not optimized/debug versions, good luck trying to develop anything substantial with that. AFAIK, most other game studios (Guerrilla, Insomniac) have already been able to ship games on x64 hardware. If you do dig into the details, Insomniac is another large supplier and user of SPU tech.

Re:Shitty code (0)

Anonymous Coward | about 2 months ago | (#47032075)

Sigh, you really are an idiot. ND blows away those other developers in technical competency. Where exactly did you think Sony's ICE team was located, Waldo?

Re:Shitty code (0)

Anonymous Coward | about 2 months ago | (#47032657)

Sigh. The truth is that ND and Insomniac have worked on tech together, much before the PS3. I am just telling you that with those kind of partnerships and years of developing games for multiple consoles and hardware (MIPS, PowerPC and ARM), ND would know better than to write unportable code.

Clearly, you aren't from ND. And if you say you know so much about ND, tell us who you are and offer to AMA - so we know there's an element of possible truth to your arguments.

There's actually some validity to the GP's post. (4, Informative)

tlambert (566799) | about 2 months ago | (#47030141)

You realize the PS3's hardware WAS exotic right? That's exactly why it's hard! Write code optimized for multiple SPE units, and see how well you can get it to run on x86.

There's actually some validity to the GP's post.

Ideally, you would write the game portably, knowing that you will need to potentially take it to market on a lot of platforms, if it ends up being a popular title, and so as a result, you'd have a minimal porting set that could just be compiled and run, with additional optimizations on top of that tuned for the platform on which it's going to run.

Although not done a lot recently, the implementation of the original libc had C versions for all the code contained therein, and then had hand optimized assembly versions that would replace the C versions on a specific platform.

The intent was to be able to get it up and running on a new platform fairly quickly by having a small *required* assembly language footprint in the context switch, bootstrap, and CRT0 code, and then optimize the C code to assembly on a platform-specific basis once you were up and running self-hosted on the platform. This also gave you the opportunity to check assembly optimizations in user space first, without breaking everything by trying something that wouldn't work because of some mistake and ending up with a lot of work to back the changes out (this was back in the RCS/SCCS days, when source code control systems weren't as capable as they are today).
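As an illustration of that "portable C fallback, hand-tuned override" pattern, here is a minimal sketch using GCC/Clang weak symbols. The function name and the weak-symbol mechanism are just for illustration; the original libc wired this up through its build system rather than this exact attribute.

#include <cstddef>
#include <cstdio>

// Portable reference implementation. Marked weak so a platform port can ship
// a hand-tuned definition with the same name and the linker picks that instead.
extern "C" __attribute__((weak))
void fast_copy(void* dst, const void* src, size_t n) {
    auto* d = static_cast<unsigned char*>(dst);
    auto* s = static_cast<const unsigned char*>(src);
    while (n--) *d++ = *s++;   // slow, obvious, correct everywhere
}

// A platform-specific build would add, e.g., an assembly file with a strong
// definition of fast_copy; nothing that calls it has to change.

int main() {
    char a[16] = "hello, world!";
    char b[16] = {};
    fast_copy(b, a, sizeof a);
    std::puts(b);
}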

It makes sense to do the same thing for games; minimally, the complaints they had about shaders should have been totally workaroundable, given that DirectX doesn't allow shaders that aren't guaranteed to terminate, and requires the code to be fully unrolled, compared to, say, OpenGL, where there's no guarantee that a shader terminates (one of the reasons a game can crash a Mac or Linux using OpenGL, but can't crash Windows using the OpenGL compatibility layer -- if it won't unroll, then it's discarded by DirectX).

In any case, it does show that there were at least some corners cut, and just because the host library is similar, you shouldn't expect the hand-tuned code to be at all similar, especially going from a Cell architecture PS/3 (essentially, a data flow processor) to a Von Neumann architecture on an AMD processor in the PS/4. It's obvious that all the hand-tuned pieces would need to be rewritten, just as if you were porting to Windows or Xbox or some other platform that wasn't also Cell-based. You'd think if they had planned ahead for ports to platforms other than the PS/3, that planning would be directly applicable to getting the code running on the PS/4 as well.

Re:There's actually some validity to the GP's post (1, Insightful)

Stickiler (2767941) | about 2 months ago | (#47030231)

The main problem with this argument is that The Last of Us was PS3 exclusive, and it was PS3 exclusive from the start, so there's no need to write the code for portability (which inherently means the game will run worse, because you can't use platform-specific optimisations). It was only when the PS4 came out that the development team considered possibly porting the game, and even then it was never a guaranteed thing. The discussion of whether the game SHOULD have been ported to other platforms is a different argument altogether. It was only because the game was tuned directly for the PS3 platform that it was able to look and play as well as it did.

Re:There's actually some validity to the GP's post (0, Insightful)

Anonymous Coward | about 2 months ago | (#47030277)

The main problem with this argument is that The Last of Us was PS3 exclusive, and it was PS3 exclusive from the start, so there's no need to write the code for portability.

"PS3 exclusive" is a marketing term. YOU ALWAYS WRITE PORTABLE CODE and then you optimize specific path IF AND WHERE NEEDED.

The down-moded AC is right. It's was their own fault and they are awful programmers.

Re:There's actually some validity to the GP's post (5, Insightful)

_Shad0w_ (127912) | about 2 months ago | (#47030349)

Alternatively they're really good programmers who got explicitly told "make this run like shit off a shovel and don't worry about portability - this will only ever be on PS3". You can say "but we should really write portable code", but if SMT still tell you to ignore portability then you're left with either doing what you're told or quitting.

Re:There's actually some validity to the GP's post (4, Informative)

Jesus_666 (702802) | about 2 months ago | (#47030373)

Well, I'm not so sure about that. They designed their game for the horribly quirky PS3, which means that they'd either end up wasting much of the console's power or they'd twist the code until their video game works like a streaming application - which is the only thing the Cell can do efficiently. Since portability was not an issue (they knew the game was a PS3 exclusive) they decided to go with a PS3-specific design in order to get the most out of the hardware, thus making their game more appealing and thus more profitable.

They could decide between "write this platform-specific program in a suboptimal way so that it's easier to port to another platform" or "heavily optimize for the sole target platform in order to increase market success". I don't think the latter was an invalid choice. If anything, this story illustrates just how bad an idea it was to put a Cell in a game console.

Re:There's actually some validity to the GP's post (1)

asmkm22 (1902712) | about 2 months ago | (#47033041)

Keep in mind that Naughty Dog was and is a studio that does Sony exclusives. There was no reason for them to worry about portability, which is exactly why their games were always leading edge in performance. That studio, above all others, knew how to use every last resource that the PS3 offered.

Re:Shitty code (1)

DrXym (126579) | about 2 months ago | (#47030499)

It was exotic for the time, but most of the principles are little different from GPGPU programming. You write programs called kernels that are loaded onto the SPU, you feed them data, the kernel processes the data and the result is used for something else. Nowadays people write similar programs using OpenCL or CUDA.
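For comparison, the modern equivalent of that workflow looks roughly like the following minimal OpenCL 1.x host program (error checking omitted; it assumes an OpenCL runtime and device are installed, and is not taken from any PS3 or PS4 SDK): build a kernel, feed it a buffer, read the result back.

#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    const char* src =
        "__kernel void scale(__global float* v, float k) {"
        "    v[get_global_id(0)] *= k;"
        "}";

    cl_platform_id platform; clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device;     clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, nullptr);
    cl_int err;
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, nullptr, &err);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "scale", &err);

    std::vector<float> data(1024, 3.0f);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                data.size() * sizeof(float), data.data(), &err);
    float k = 2.0f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kernel, 1, sizeof(float), &k);

    size_t global = data.size();
    clEnqueueNDRangeKernel(q, kernel, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, data.size() * sizeof(float),
                        data.data(), 0, nullptr, nullptr);

    std::printf("data[0] = %f\n", data[0]);   // 6.0 if everything worked
    return 0;
}

The shape is the same as an SPU job: upload data, run a small program on a coprocessor, read the result back; the difference is that the runtime handles placement and transfers instead of the programmer hand-scheduling DMA.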

I expect the main difficulty in porting from PS3 to PS4 is the effort of converting all those programs between two systems that are broadly similar but different in the details. The GPU shaders too were written for NVidia processors and now have to be converted / rewritten. And then there's working through all the hacks, shortcuts and bad assumptions that come from dealing with code which has been crunched out to work on one specific platform and not another.

So it's mostly drudgery. I expect virtually all of the assets and much of the higher level code is reusable.

Re:Shitty code (1)

Zembar (803935) | about 2 months ago | (#47030047)

The PS4 is x86, the PS3... not so much.

Anything assembler-level optimized for the cell architecture would be absolute hell to port.

Re:Shitty code (2)

Tablizer (95088) | about 2 months ago | (#47030079)

I don't know about that. Sometimes there is an inherent trade-off between being machine-friendly and human-maintainer-friendly. Tuning for machine performance sometimes gets in the way of high-level abstractions that make porting to a different architecture easier.

Reminds me a bit of the Story of Mel:

http://www.cs.utah.edu/~elb/fo... [utah.edu]

Re:Shitty code (0)

Anonymous Coward | about 2 months ago | (#47031047)

I love this part

The code optimization done by development studio Naughty Dog was a real technical achievement — making graphics look modern and impressive

Uh, are we looking at the same game, because from what I can see, it looks like complete ass. It does not look modern and it does not look remotely impressive. Looking at the shit lighting, the low poly counts and the blurry textures, this looks like a game from 2006.

Re:Shitty code (2)

beelsebob (529313) | about 2 months ago | (#47031075)

No... You're not talking about some code that's designed to be portable, you're talking about code that was targeting exactly one piece of hardware, trying to make the output look better on that one piece of hardware than anyone else developing for the same hardware. Because of this, games for consoles have an insane level of optimisation for that one piece of hardware. More so in this case: you're talking about a piece of hardware that had one of the most exotic CPUs in decades. Porting from that to x86 is a huge challenge.

Re:Shitty code (0)

Anonymous Coward | about 2 months ago | (#47031165)

This is why I don't fault developers for not coding "directly to the hardware" anymore. So much extra effort (which means release delays, higher costs, tougher bug fixing) for minimal gain.

This isn't 1980 anymore. You don't have to eke out every last CPU cycle to get a playable game. Focus on the game design, leave the optimizations to the compiler.

Porting is news for who? (-1)

Anonymous Coward | about 2 months ago | (#47030049)

How is a studio porting a game to x86 hardware news? I mean, I do feel for Naughty Dog on this one - they've had the luxury, for the last several years, of producing platform exclusives that let them completely avoid building any competency in porting their games to the most common & capable hardware on the planet. Pity pity.

Re:Porting is news for who? (1)

Stickiler (2767941) | about 2 months ago | (#47030077)

Mainly because the PS3 was a radically different architecture. The company had to do assembler-level optimisations to get the game working well, and now they're trying to recode those optimisations for x86 while ensuring the shaders, physics, lighting etc. all work perfectly. It's an impressive project.

It's not about assembly code ... (2)

perpenso (1613749) | about 2 months ago | (#47030139)

Mainly because the PS3 was a radically different architecture. The company had to do assembler-level optimisations to get the game working well, and now they're trying to recode those optimisations for x86 while ensuring the shaders, physics, lighting etc. all work perfectly. It's an impressive project.

It's not about assembly code; this mischaracterizes the problem.

Optimizing for the PS3's Cell architecture is not simply rewriting some critical code in assembly. It's more of a parallel processing effort. To greatly simplify things, the Cell has a general purpose CPU and six specialized coprocessors. The trick to Cell optimization is moving code from the CPU to the coprocessors and keeping those coprocessors as busy as possible. Now add the complication that the coprocessors are not interchangeable.

Assembly has little to do with it. It's architecting the code to keep 7 parallel processors going; whether the underlying code that implements this software architecture is C or assembly doesn't matter a whole lot.

When you consider the fact.... (1)

Anonymous Coward | about 2 months ago | (#47030303)

That the PS4 has the same or greater number of CPUs as the Cell had CPU+SPUs, plus the ability to use compute units on the GPU as 'simplified' SPUs, it becomes much more difficult to understand how these guys had significant trouble re-optimizing their code for the PS4's architecture. Now mind you, there ARE places they could've been having stalls (since the AMD Jaguar cores share resources per pair (like Bulldozer, correct?)), but most of the places where such issues would come into play should be optimizable into OpenCL datapaths and thus offload enough threads from the CPU for it to be acting like a 'native' quad core, with some number of Cell-like processing streams to handle the less general purpose computations.

While programming and game programming are difficult, and can in some cases compare to rocket science, this really isn't one of them. This is more about identifying the right resource or tool for the job and utilizing it in the most efficient manner possible.

Having gotten looks at plenty of 'production' game source code, mostly open sourced, I am sure plenty of you know how much more akin it is to the OpenSSL library's code than to, say, 'an idealized, portable, and properly documented codebase'.

Re:When you consider the fact.... (0)

Anonymous Coward | about 2 months ago | (#47030531)

since the AMD Jaguar cores share resources per pair (like Bulldozer, correct?)

Incorrect - unless you count shared L2 caches. But even there, every Jaguar core on the PS4 SoC has twice as much L2 cache to itself (512k) as every PS3 SPE had in total local memory (256k).

Re:Porting is news for who? (1)

aliquis (678370) | about 2 months ago | (#47030199)

Seems to me like there's some pretty good advantage in doing games for x86 and Windows... (but maybe not as strong if you optimise at this level).

Super stoked (0)

Noah Haders (3621429) | about 2 months ago | (#47030069)

I'm super stoked about this game! Never had a os3 so I missed out on it before.

Re:Super stoked (0)

antdude (79039) | about 2 months ago | (#47030125)

os3? OS/3 after OS/2? :P

Own medicine (0)

Anonymous Coward | about 2 months ago | (#47030071)

That is what you get if you try to lock customers into a restricted platform. What about all those games consumers bought for their PS3 that are now obsolete?

Re:Own medicine (0)

Anonymous Coward | about 2 months ago | (#47030109)

We still play them on our PS3's?

Re:Own medicine (1)

xyzzyman (811669) | about 2 months ago | (#47030113)

How are the games obsolete? I bought a PS3 in September and have over 20 games on disk along with a bunch of digital ones and I love it.

Re:Own medicine (1)

aliquis (678370) | about 2 months ago | (#47030201)

He exaggerated.

But over in the Nintendo camp it's supposedly bye-bye to all online multiplayer gaming for the Wii and DS now.

(Well, that has happened on PC too, but it's still due to vendor lock-in at least :))

Re:Own medicine (1)

donaldm (919619) | about 2 months ago | (#47031441)

How are the games obsolete? I bought a PS3 in September and have over 20 games on disk along with a bunch of digital ones and I love it.

Not only that, but if you have PS1 games you can still play them on your PS3, though with the exception of a few PS1 games you probably won't. Unfortunately only the first fat PS3s played PS1, PS2 and of course PS3 games. However, if you compare graphics on a decent large HDTV, most PS1 games are fairly grainy even with smoothing on; the PS2 (fat only) is quite reasonable and many games are very playable.

It is unfortunate, but as time goes by the only way to play older PS1, PS2 and eventually PS3 games is to run them in an emulator. This does not only apply to PlayStation games but to all games that ran on a different architecture.

At the moment I have no intention of purchasing a PS4 since there are not many games I see for it that make it a compelling purchase for me. Backwards compatibility would have made for a compelling reason since I have many PS3 games I am still interested in purchasing as well as ones I am currently playing, and since I like RPGs and Action Adventure I have enough for hundreds of hours of game-play before I even consider getting a PS4.

Re:Own medicine (0)

Anonymous Coward | about 2 months ago | (#47030137)

Go to bed, RMS.

So someone didn't follow the practice ... (1)

Ihlosi (895663) | about 2 months ago | (#47030205)

... of making the software run correctly first, and only then doing optimizations (down to the assembly level)?

Sorry, but *yawn*.

Had they followed the practice, they would have a version of the source code that runs correctly (but slowly) that they could optimize for different target platforms.

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47030335)

... of making the software run correctly first, and only then doing optimizations (down to the assembly level)?

Sorry, but *yawn*.

Had they followed the practice, they would have a version of the source code that runs correctly (but slowly) that they could optimize for different target platforms.

Sorry but *yawn*
Have you ever tried to port a PHP project to Java MVC? Yes, it's that easy. Yawn.

5, Insightful? More like 5, Ignorant. (0)

Anonymous Coward | about 2 months ago | (#47030415)

Coding for the Cell is a pain. It has six coprocessors with limitations you can't ignore. People who never worked with heterogeneous architectures may think you can implement a non-optimized design and achieve good performance, but you can't for anything big like a game. If it was launched only on PS3 at first, it wouldn't even make much sense to worry about porting unless they knew it'd sell.
Besides, they already have a non-Cell version on PC, assuming this is the same game.

Re:So someone didn't follow the practice ... (5, Insightful)

Anonymous Coward | about 2 months ago | (#47030441)

Is it a good practice for cases like these? I argue not.

Let's say that we do create a reference version, then optimize. Since we are intending to push the hardware to its maximum, we have to assume that we will hit the occasional performance wall. How do we deal with that? We change the behavior of the program to fit within the limitations. This means that our definition of correct behavior has changed and the reference version is no longer correct. So, we update our reference version to match the version we plan to publish. This involves backporting the changes, then carefully testing to verify that the behavior is exactly the same as in the optimized version.

We're left with the question: Why do we call it a reference version if it is derived from the version that is supposedly derived from it?

The optimized version is the real reference version. The "reference" version is really just a port to a hypothetical platform. And, rather than just porting the final version, we are porting every bit of wasted effort along the way.

We get all the cost of the PS3 to PS4 port, dragged out over the whole path of development, with no target platform to sell it on. Sure, porting "reference" to PS4 will be cheaper than PS3 to PS4; however, PS3 to reference to PS4 will be much more expensive than directly porting PS3 to PS4.

So, best case, we spend more money to save time porting to a platform we never intended to support. Worst case, we spend a lot of money on a port that doesn't go anywhere. It's a lose-lose.

Re:So someone didn't follow the practice ... (1)

KliX (164895) | about 2 months ago | (#47030491)

The optimisations are the program, there is no unoptimised state to work from. Have you ever worked with soft real-time code?

Re:So someone didn't follow the practice ... (0)

Ihlosi (895663) | about 2 months ago | (#47030895)

The optimisations are the program, there is no unoptimised state to work from.

And that is the mistake. You first need something that actually runs correctly, and then optimize it to work with the hardware that is at your disposal.

Have you ever worked with soft real-time code?

Part of my work involved hard real-time code, down to twiddling with sub-microsecond timings and of course counting CPU cycles. Still, I usually start with code that runs correctly (i.e. it would fulfill the specifications if run on an infinite-horsepower CPU) and then optimize it to work with the hardware.

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47030991)

So, you did this with actual games? With the same level of complexity?

Re:So someone didn't follow the practice ... (2)

west (39918) | about 2 months ago | (#47031295)

I'd suggest that the PS3 is radically different enough that if your *reference design* wasn't PS3 specific, you've probably already failed.

You can't "optimize" a bubble sort into a quick sort.

(One interesting effect of both the PS2 and PS3 was that their design was so bizarre that it took years for programmers to be able to optimize effectively. This meant that games were consistently better every year even without any changes in the hardware. With a more conventional architecture, what you buy in the first year will essentially be state of the art for the life of the console. Makes it much harder to get people excited about "this year's version of xxx".)

Re:So someone didn't follow the practice ... (1)

Ihlosi (895663) | about 2 months ago | (#47031975)

You can't "optimize" a bubble sort into a quick sort.

When you're deciding on a sorting algorithm, you're already optimizing. The unoptimized version just needs to run correctly, i.e. deliver the expected result. You pick a sorting algorithm that's easy to understand for that (so, bubble sort rather than quick sort). Choosing one that is best suited to the target hardware and implementing it optimally on the target hardware is part of the optimization process.
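A trivial sketch of that workflow, with bubble sort as the obviously-correct reference and std::sort standing in for the "optimized for the target" version, checked against each other (illustrative only, not anyone's actual engine code):

#include <algorithm>
#include <cassert>
#include <random>
#include <vector>

// Reference: easy to read, easy to convince yourself it's right, slow.
void bubble_sort(std::vector<int>& v) {
    for (size_t i = 0; i + 1 < v.size(); ++i)
        for (size_t j = 0; j + 1 < v.size() - i; ++j)
            if (v[j] > v[j + 1]) std::swap(v[j], v[j + 1]);
}

int main() {
    std::mt19937 rng(42);
    std::vector<int> a(1000);
    for (int& x : a) x = static_cast<int>(rng());
    std::vector<int> b = a;

    bubble_sort(a);                   // reference result
    std::sort(b.begin(), b.end());    // "optimized" result

    assert(a == b);                   // the optimized path must match the reference
}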

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47030535)

lol, wow, hi Comic Book Guy.

The issue is that the PS3 is such a different architecture, and so fiendishly difficult to optimize for, that you can't go the route you are advocating and end up with the results they did on the PS3. If you are talking XBOX to Gamecube, or XBOX to PS4 sure, but the PS3 is a really unique beast.

Re:So someone didn't follow the practice ... (3, Informative)

_xeno_ (155264) | about 2 months ago | (#47030593)

Had they followed the practice, they would have a version of the source code that runs correctly (but slowly) that they could optimize for different target platforms.

I expect that when they started, they had no intention of porting to other platforms.

Naughty Dog is Sony these days. They only make games for Sony platforms. So they targeted only the PS3. I'll bet when development started, Sony hadn't finalized PS4 plans.

Now the PS4 is out and desperate for games (go ahead, name a PS4 exclusive), so Sony is having them port it to PS4. And since the game was never intended for anything other than the PS3, they're running into difficulties.

I wouldn't blame the programmers for optimizing for the only platform they were told to target, I'd blame the managers for suddenly springing a new platform on them after the game was done.

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47030685)

The Playstation-has-no-games meme was true back when the PS3 first came out, but it ended up with significantly more (and often better) exclusives than the 360 - hell, their camera & motion-tracking add-on had more and better exclusives than the 360's kinect. Of course, since memes never die, the next-gen console with more exclusives is being mocked for having no games. What's next, people claiming the xbox controller is huge for people with huge hands?

Re:So someone didn't follow the practice ... (1)

donaldm (919619) | about 2 months ago | (#47031503)

The Playstation-has-no-games meme was true back when the PS3 first came out, but it ended up with significantly more (and often better) exclusives than the 360 - hell, their camera & motion-tracking add-on had more and better exclusives than the 360's kinect. Of course, since memes never die, the next-gen console with more exclusives is being mocked for having no games. What's next, people claiming the xbox controller is huge for people with huge hands?

Now that leads to the following cartoon [penny-arcade.com] :).

Re:So someone didn't follow the practice ... (0)

marsu_k (701360) | about 2 months ago | (#47030733)

Now the PS4 is out and desperate for games (go ahead, name a PS4 exclusive)

Easy. Infamous: Second Son. Which apparently sold better than Titanfall on Xbone [theverge.com] .

Re:So someone didn't follow the practice ... (1)

_xeno_ (155264) | about 2 months ago | (#47032741)

Except I've heard of Titanfall, and I thought "Infamous: Second Son" was the PS3 sequel to Infamous. Apparently it's in fact the third game in the series, proving just how good Sony is at both marketing and naming things.

Hilariously, when I Googled it to find that out, the top hit was "How Second Son Really Hurt The inFamous Franchise [onlysp.com] ", so I'm not sure that's really a good exclusive to have...

Plus, your own link points out Titanfall sold better than Infamous: Second Son; it just did it across more platforms because it isn't an Xbox One exclusive, leaving the Xbox One version's sales alone to do worse than Infamous: Second Son.

Re:So someone didn't follow the practice ... (1)

marsu_k (701360) | about 2 months ago | (#47032783)

I don't know what kind of personalized settings you have on your Google account, but I'm unable to find your link from the first four pages of my results (I never venture any further, usually never past page two even).

Re:So someone didn't follow the practice ... (2)

Ihlosi (895663) | about 2 months ago | (#47030901)

I expect that when they started, they had no intention of porting to other platforms.

Oh, yeah ... that is the other mistake. "No one's ever gonna look at this code again, so it's okay if it's an incomprehensible, unmaintainable mess. Ship it!".

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47031243)

That isn't the same at all. You can program for a specific platform using perfectly good code, and have it be difficult to use on another platform.

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47031309)

I can only dream of making a "mistake" that sells over 6 million copies at $60 a pop...

Talk about missing the forest for the trees.

Re:So someone didn't follow the practice ... (1)

ultranova (717540) | about 2 months ago | (#47032339)

Oh, yeah ... that is the other mistake. "No one's ever gonna look at this code again, so it's okay if it's an incomprehensible, unmaintainable mess. Ship it!".

But that's not what the summary said. It said the code is specific to a particular architecture, and there's considerable effort required to port it to a completely different architecture.

It doesn't matter how neat your C program might be, moving the computations to OpenCL is still going to be a pain.

Re:So someone didn't follow the practice ... (1)

Ihlosi (895663) | about 2 months ago | (#47031943)

I'd blame the managers for suddenly springing a new platform on them after the game was done.

This is normal manager behavior. An experienced programmer plans for it, or makes sure he's not the one who has to adapt the old code to the new platform.

Re:So someone didn't follow the practice ... (2)

CronoCloud (590650) | about 2 months ago | (#47031963)

Now the PS4 is out and desperate for games (go ahead, name a PS4 exclusive)

Umm wait, I think I can name one... ummm. Resogun? Ummmm, Infamous: Second Son? And there's a brown shooter or two I can't remember the name of. [looks over at the PS4 sitting mostly idle because of the lack of games in genres I want to play... maybe I should have got a Vita instead. I do play DCUO on it now and again... but that is not exclusive to the PS4]

Re:So someone didn't follow the practice ... (1)

VortexCortex (1117377) | about 2 months ago | (#47030639)

Correct First, Clever Later is my core philosophy. There is nothing I don't abstract via a portability layer.

I wrote a "hobby" OS kernel from scratch, starting only with a bootable hex editor on x86. The FIRST thing I did was create a simple bytecode VM to call into a function table to give me ASM level abstraction, before I even switched out of realmode to protected mode. THEN I created the bootloader, a simple text editor, assembler, disassembler, a crappy starter file system, and that was enough to bootstrap into a a full OS completely free from the Ken Thompson's Unix Compiler Hack. [bell-labs.com] A bytecode to machine code compiler was one of the very last things that got written (it was boring). All programs on my OS (yes even Java or C code) compiles into bytecode and the kernel can link programs into machine code at "install time" or include the (shared mem page) VM stub and run them as emulated -- Which is nice to provide application plugins, scripting, or untrusted code. Since the kernel is compiling things it can do far more checks on the validity of binary code, as well as optimizations. Since multiple languages can compile to the same bytecode I can transparently integrate libraries written in different languages instead of each language needing it's own implementation of basic things like big-integer math. I can even change calling conventions on the fly for transparent RPC.

For Gravmass last year I got some nifty embedded ARM systems to upgrade a few of my x86 robotics projects with (for better battery life). Instead of building a cross compiler (which takes a while) and compiling the OS components into ARM from x86, I just implemented the bytecode VM on ARM in one weekend, and I had my full OS software stack up and running (albeit slowly, since I lacked native compilation).

I took the time to do things right, so porting was painless and it was actually fun to migrate to a whole new architecture. Then I was free to debug, build, and test stuff right on the hardware over a serial console. Now I've got a project to build a usable computer from scratch (a transistor version of a CPU I built from about 400 contactors) and the new system will use my VM bytecode as its native machine language, so I won't have to write a compiler. I'm just taking things nice and easy, even teaching kids and neighborhood enthusiasts CS101 as I go. Man, it sure is nice to have as much time as I want on these hobby projects -- time to actually do things absolutely correctly first, then as clever as I dream of later...

However, imagine a publisher is breathing down your neck to get things up and working and done before a deadline. They don't care that it makes porting your code to a new system harder; right now what matters is getting your basic engine up and running pronto so other folks can start working with their content on the dev kits / consoles. Now imagine that even if you started out with a cross-platform system, you've got to squeeze a new feature in and it needs to work fast, and yesterday -- no, you don't have time to write a slow version and a fast non-portable version, and you've made this sort of addition so many times that it hasn't been possible to run the slow portable code with the latest assets in over a year.

Yeah, the GPU and shaders aren't all that different but this new platform is the difference between porting a program design from Erlang to a multi-threaded C program. Getting an image up isn't just getting a spinning flat-shaded teapot displayed -- You've got a bunch of code in your asset loading system to handle your data storage format before you can even get started, and both of these were optimized for the CELL processor's big endian while the new platform is little endian. The asset streaming was probably done in several stages distributed among the CPUs, and operating on batches of data a few KB at a time. I've got no doubt that before you could see a properly textured and lit wall there'd be some serious lifting to be done. Floating point formats might not even be directly convertible since IEEE-754 standard doesn't say shit about byte order. One mistake anywhere throws a matrix off or gorks vertex order and you're looking at nothing with no clue why. You can pray that your SDK has a debugger for the GPU... but it probably won't since this is a new platform and the debugger you wrote for the previous system may not ever work on it properly. Which is why some devs shed a tear of joy when Valve opened their VOGL debugger [github.com] .
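As a small illustration of the endianness point above: assets baked for the big-endian Cell have to be byte-swapped field by field when loaded on a little-endian x86 machine, floats included, by swapping the raw bits. A rough sketch (illustrative helper names, not any particular engine's loader):

#include <cstdint>
#include <cstdio>
#include <cstring>

uint32_t swap32(uint32_t v) {
    return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
           ((v << 8) & 0x00FF0000u) | (v << 24);
}

// Read a big-endian float from a byte stream on a little-endian host.
float read_be_float(const uint8_t* p) {
    uint32_t bits;
    std::memcpy(&bits, p, sizeof bits);  // raw bytes as stored in the asset
    bits = swap32(bits);                 // reorder to host byte order
    float f;
    std::memcpy(&f, &bits, sizeof f);    // reinterpret the swapped bits
    return f;
}

int main() {
    // 1.0f (0x3F800000) as it would appear in a big-endian asset file.
    const uint8_t blob[4] = { 0x3F, 0x80, 0x00, 0x00 };
    std::printf("%f\n", read_be_float(blob));  // prints 1.000000 on x86
}

Multiply that by every field in every serialized structure, and it's easy to see why a loader tuned for one byte order doesn't just drop onto the other.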

Regular software devs don't face the same crap that game devs do trying to debug games - you can probably even fire up an emulator, pause and dump machine state. Graphics programmers are largely going in blind. If something doesn't work you hope it's something you just did now and not some subtle bug introduced a week ago that you're running over. Remember before you learned to use a debugger, and you'd just print messages to the console? Yeah, well, debugging on a GPU these days is worse than that; we don't even have a console log to print out. I've taken to debugging shaders separately from the main program [uni-stuttgart.de], but when something goes wrong that I can't debug externally I'm pretty screwed, even more so if the project has already got a lot of complex things going on beyond the spot I'm testing.

I'm sure once they got a model up on the screen the going was a lot smoother, working with code in the high-level language, maybe re-implementing any missing code that existed only as an ASM routine. The meat of the matter is that they'll have to re-architect the whole system. Even if they had a perfect cross-platform "version of the source code that runs correctly (but slowly)" they'd still have to playtest side by side to detect any subtle lighting, texturing, or rigging glitches in the data itself, especially if they use any higher-res textures or slightly better shaders on the new platform -- better to change these now rather than do all the tests twice. If you're off by one and a single light that the level designers and artists built a scene around goes missing, it's not the end of the world, but you can't really chance it.

Games just aren't the same beasts as business software. Even from the less complex embedded world, I know for a fact there are quite a few MIPS systems out there running a hacked-together FORTH interpreter as their bootloader / init process, and some of these things control industrial machines. Even if I got the time to fix things, no amount of begging will get the boss to agree to spend time on making things forward-portable; they'd rather charge that to the future guys who have to fix the mess. It's working? You're done.

Lots of gamers understand that the CPU architecture between PS3 and PS4 is different, but many don't understand just how much different it actually is, and how much of a pain in the ass debugging graphical applications is. Personally, I like things like TFA. Used to be players didn't get to know anything about how games were developed at all. Now there's a little bit more info leaking through. Hopefully publishers will eventually take a page from the openly developed indie scene, maybe then they can get some feedback BEFORE they make a flop. Maybe market forces will push towards a more open model of development. What I think is kind of fun is that these programmers are getting a taste of what the testing department goes through, heh.

In other words: Yeah, they fucked up under pressure, but that's par for the course. You might want to pay attention to the difficulties of porting between such parallel systems, since material science isn't exactly keeping up with processor speed. I mean, Look who's talking: Your kernels don't have bytecode compilers so you can't even do basic tasks like copy all of your programs between your desktop, phone, tablet or robot and run them on whatever machine you have. Even your VM OSs like Android aren't compiling once at install time, so they don't leverage full native speeds. You think hardware migration woes inexcusable? Well your program doesn't even run in more than one language. Porting applications between C, Android Java, HTML+JS and ObjC means a complete rewrite for most folks since you don't even write programs in a cross-platform meta-language and compile down into target languages with a compiler-compiler. Anyone who's been around long enough knows that languages can come and go. Best hope when Moore's Law bottoms out around 2020 (due atomic sizes) that we don't all migrate from C/C++ to something like Erlang for system level programming in order to leverage all of the many cores we'll need to go faster, otherwise you'll appreciate the pain of porting too.

My hope for humanity certainly isn't shaken by gamedevs under pressure since it wasn't destroyed by you "Language Level" coders who poke fun at "Assembly Level" coders while you're in the same damn boat just crossing a bigger pond -- ask folks who ported codebases from COBOL to FORTRAN to C, etc. Hell, some folks are STILL in that process. Be smug all you want, but you're still just 'the pot calling the kettle black' to me until you can figure out how to write your C program such that you don't have to rewrite it in Python, Erlang or Go, or recompile it to run it on another system.

Re:So someone didn't follow the practice ... (1)

Threni (635302) | about 2 months ago | (#47032425)

No, that's what you do when you're writing something easy any entry level developer can do, like most of the shit enterprise software companies insist on writing. It doesn't work when you've got a cutting edge console with alien technology and a limited number of developers capable of writing for it. This isn't exactly Stack Overflow "pls help i am writing anroid app but i am getting teh crash plz to give me tutorial" territory.

Re:So someone didn't follow the practice ... (0)

Anonymous Coward | about 2 months ago | (#47032747)

... of making the software run correctly first, and only then doing optimizations (down to the assembly level)?

Sorry, but *yawn*.

Had they followed the practice, they would have a version of the source code that runs correctly (but slowly) that they could optimize for different target platforms.

You've quite obviously never worked in embedded systems so your opinion is valueless.

The Cell processor strikes again! (0)

Anonymous Coward | about 2 months ago | (#47030217)

Anyone remember how many cruise missiles that thing could power?

The problem with today's games (0)

Anonymous Coward | about 2 months ago | (#47030269)

Lots of graphics eye candy, little play (and replay) value.

Too much focus on online play and DLC.

Re:The problem with today's games (1)

aliquis (678370) | about 2 months ago | (#47030761)

What about all the indie bundles / small games?

Surely you can find a shitload of platformers, shoot 'em ups, puzzle games, physics games, space development games and so on to fill your time with?

In other words... (0)

Anonymous Coward | about 2 months ago | (#47030291)

Don't expect a discount just because it's an old title.

How to whinge and blow your own trumpet at the sam (0)

Anonymous Coward | about 2 months ago | (#47030309)

It's your job, just get on with it, maybe write the code with portability in mind next time?

I doubt it would be terribly harder (0)

Anonymous Coward | about 2 months ago | (#47030543)

There are excellent books written by engineers at Naughty Dog (such as Game Engine Architecture) which describe the kind of abstractions most engines perform. The multithreaded job handlers are possibly what would have to be x64-optimized, while most of the general code would remain intact.

Anyone familiar with Naughty Dog's GDC presentations would agree.

PS4 hardware (1)

PFritz21 (766949) | about 2 months ago | (#47031343)

Why didn't Sony make the PS4 hardware more like that of the PS3? Then less effort would be needed to ensure PS3 games are portable to the newer system.

Re:PS4 hardware (2)

drinkypoo (153816) | about 2 months ago | (#47031701)

Because the PS3's hardware was stupid. It never paid out the kind of performance dividends they claimed it would, and to even approach them took insane amounts of effort.

Re:PS4 hardware (0)

Lisias (447563) | about 2 months ago | (#47032001)

Because the PS3's hardware was stupid.

No, [wikipedia.org] it [umassd.edu] was [wired.com] not. [universetoday.com]

Re:PS4 hardware (4, Insightful)

drinkypoo (153816) | about 2 months ago | (#47032143)

Because the PS3's hardware was stupid.

No, it was not.

Yes, yes it was. I didn't argue that cell had no purpose, but it was stupid to use it for gaming. And sadly, cell was only ever widely used in scientific computing in PS3s, because the only other way to get it was madly overpriced. Then Sony removed OtherOS and the supply of PS3s usable for scientific computing was constrained, although by that time the cell had been far surpassed.

Sony gained dominance in part because development for the Playstation was simpler than development for the Saturn or the Jaguar. Microsoft gained a foothold in part because development for the PS2 was a super bitch. Then Sony went on to make another console for which development was difficult, and the 360 became the dominant console of its generation. Microsoft probably wouldn't even be in gaming today if Sony had adopted a more conservative architecture for the PS3. It took them until the PS4 to figure that out, and they did indeed finally manage it — by making the same console Microsoft was making, or vice versa.

Re:PS4 hardware (0)

Anonymous Coward | about 2 months ago | (#47032321)

and the 360 became the dominant console of its generation

In the US maybe, but worldwide the PS3 has sold more - outside the US the PS3 has been dominant for quite a while.

Re:PS4 hardware (1)

Lisias (447563) | about 2 months ago | (#47032417)

No, it was not. =D

You make a very poor judgment of the last decade's developers, and show a somewhat lacking knowledge of the history.

What gave the PS3 a good start was Microsoft's huge failure to deliver reliably working hardware (the Red Ring of Death issue, remember?). Any other company would have folded, but Microsoft had (and still has) a huge cash cow to milk (Microsoft Office), and that is the sole reason the Xbox didn't fold at the time.

But Microsoft's people aren't stupid (for the most part, at least), and the Xbox was fixed. And they did one really big thing: Xbox Live. This really changed the game, and Sony got a nasty bite in the ass. Sony took far too long to get PSN up to Live's level (and is still working on it, by the way).

By the way, the Xbox 360 is also a PowerPC machine - built around the same core design used in Sony's PS3. So are the Nintendo Wii (and the Wii U), the GameCube, Apple's failed Pippin, and plenty of other dead-on-arrival videogame systems. There's absolutely nothing weird about PowerPC being used in videogame consoles. Sony made things complicated for developers by tacking six specialized coprocessors onto its chip - which, for programmers used to having just one general-purpose CPU plus some SIMD extensions (MMX, 3DNow!, etc.), was clearly a new level of computing. However, parallel computing appears to be here to stay (look at the ARM chips) - and having EIGHT general-purpose CPUs competing for the same resources is not for the faint of heart (Microsoft took years to learn this; look at that crappy piece of software called Microsoft Windows). The complexity is still there - we've just shifted it somewhere else.

Videogame makers choose their hardware based on price, performance, and availability over the coming years. PowerPC had won the makers' hearts in the past, but for whatever reasons IBM chose to abandon this race, giving PowerPC a low priority in research and development. AMD, on the other hand, spent a lot of effort and money scaling its CPUs up to current levels. The decision to use AMD's x64 over the previous PowerPC design was made on the basis of CURRENT chip performance, CURRENT chip pricing, and guarantees of chip supply for the product's lifetime.

The new chip is easier to program? A beneficial side effect, nothing more. Not a single videogame maker will sacrifice any of the previous requirements to make developers' lives easier. We are not in the '90s anymore; there are a lot more people with programming skills nowadays, and plenty of them are willing to deal with any extra complexity to take your job from you. (Sad but true.)

Re:PS4 hardware (1)

Baloroth (2370816) | about 2 months ago | (#47032855)

There's absolutely nothing weird about PowerPC being used in videogame consoles.

No one is saying using a PowerPC-based chip was stupid. Virtually everyone is saying using a Cell-based chip was stupid. You automatically lose performance relative to your competitors on games that don't take full advantage of the Cell architecture, which is precisely those multi-platform games where people can directly compare performance on the PS3 with performance on the Xbox. This article is a testament to the code specialization required to take full advantage of the architecture, and game developers simply weren't willing to put in that kind of effort (especially for a console that sold more for its ability to play Blu-ray discs than for its gaming capabilities). Often, even PS3 exclusives didn't utilize the Cell properly: it simply took too much work on an architecture few developers were familiar with (while PowerPC-based, the SPE co-processor design means you have to use radically different techniques than you would for a normal PowerPC system).

Car analogy time: it's like giving a bunch of drivers who don't know how to use manual transmissions a manual car. Yes, manual transmission is faster than automatic, but if your drivers don't know how to use it properly, it's always going to end up being slower in practice.
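To make the "radically different techniques" point concrete, here is a rough sketch in ordinary portable C++ (standing in for real SPU code - the chunk size and names are my own, not Sony's API). A conventional CPU core just walks the array through its cache; an SPE sees only its small local store, so the same loop has to be restructured to stage data in and out in chunks, the way DMA transfers do on the real hardware.

<ecode>
#include <algorithm>
#include <cstddef>
#include <cstring>

// Conventional version: the core reads and writes main memory directly.
void ScaleAll(float* data, std::size_t count, float k) {
    for (std::size_t i = 0; i < count; ++i)
        data[i] *= k;
}

// SPE-flavored version: process fixed-size chunks copied into a small
// "local store" buffer and copied back out (memcpy stands in for DMA here).
void ScaleAllChunked(float* data, std::size_t count, float k) {
    constexpr std::size_t kChunk = 4096;   // elements per simulated DMA transfer
    float localStore[kChunk];              // stand-in for the SPE's local memory
    for (std::size_t base = 0; base < count; base += kChunk) {
        const std::size_t n = std::min(kChunk, count - base);
        std::memcpy(localStore, data + base, n * sizeof(float));   // "DMA in"
        for (std::size_t i = 0; i < n; ++i)
            localStore[i] *= k;
        std::memcpy(data + base, localStore, n * sizeof(float));   // "DMA out"
    }
}
</ecode>

On the real hardware you would also double-buffer, kicking off the next chunk's transfer while the current one is being processed - restructuring that most multi-platform teams understandably never bothered with.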

Re:PS4 hardware (1)

Lisias (447563) | about 2 months ago | (#47032009)

Because IBM hasn't improved the PowerPC processor line since then, while Intel, AMD, et al. spent a lot of money on the x86_64 architecture.

In the end, it's not what the architecture did in the past that matters - it's what the architecture will do in the future. Today, x86_64 is far more capable than the Cell architecture. So, if you want to build a top-performance machine today, you go with x86_64.

Wish they'd have thrown PCs a bone (1)

Cito (1725214) | about 2 months ago | (#47031467)

The last consoles I owned were the Atari 2600 & the original Nintendo.
I've been a PC gamer ever since.

Playing Ghostbusters & Chuck Yeager's Flight Simulator with a joystick on a Tandy 1000 EX was fun, as were all the free games in BASIC that came in the old magazine "Home and Office Computing", which became Family Computing (http://en.wikipedia.org/wiki/Family_Computing).

Anyhow, I've never been a fan of consoles, and while that might put me in the minority, I still wish they'd ported it to the PC.

Someone posted the entire Last of Us game with no commentary and all the cutscenes - a perfect playthrough that I enjoyed like a movie.

I did the same for Beyond: Two Souls; the entire game is posted as a movie in HD with zero commentary, 8 hours 41 minutes straight through.
http://youtu.be/9qolJTsmmWA [youtu.be]

Re:Wish they'd have thrown PCs a bone (1)

CronoCloud (590650) | about 2 months ago | (#47032013)

as were all the free games in BASIC that came in the old magazine "Home and Office Computing", which became Family Computing (http://en.wikipedia.org/wiki/Family_Computing)

You got that backwards: it was Family Computing first, with the BASIC listings, game reviews, and "full home computer coverage, not just x86"... It later became Family & Home Office Computing, adding more articles for the "wannabe home-office entrepreneur," and then Home Office Computing, dumping anything that wasn't home-office and IBM PC related.

poor game (0)

Anonymous Coward | about 2 months ago | (#47032379)

Why waste the time on a crummy game? The story and cinematics were nice, but the gameplay and combat system were horrible. I was glad to get rid of this game.

graphics, who cares (1)

jonthomson (862813) | about 2 months ago | (#47032603)

"making graphics look modern and impressive on a 7-year-old piece of hardware" - for me, graphics have looked as good as they've ever needed to be since, ooh, Metroid Prime? This is kind of detailing exactly why modern games suck, I don't care if Call of Duty 87 looks 63% more realistic than the previous one, they're not interesting to play

So much effort (1)

50000BTU_barbecue (588132) | about 2 months ago | (#47032785)

and brains and talent and resources... for a fucking video game. I wonder if people a hundred years from now will laugh at us or hate us for the squandered resources?
