Intel To Design PlayStation 4 GPU

ScuttleMonkey posted more than 5 years ago | from the wtb-cash-cow dept.

madhatter256 writes "According to the Inquirer, it looks like Intel will be designing Sony's next-gen console GPU, most likely an offshoot of the Larrabee GPU architecture. It is not yet known whether Intel will also take part in the CPU design of the console. Due to current economic times it was a no brainer for Sony to go with Intel." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any judge, the Wii2 as well.

intel inside (-1, Troll)

ionix5891 (1228718) | more than 5 years ago | (#26757333)

first psot :P

Re:intel inside (-1, Troll)

geekoid (135745) | more than 5 years ago | (#26757345)

BEEF FEED

im full of jokes today (5, Funny)

ionix5891 (1228718) | more than 5 years ago | (#26757369)

that should be Playstation 3.99967873 :P

Re:im full of jokes today (5, Insightful)

rcuhljr (1132713) | more than 5 years ago | (#26757915)

Shouldn't they put out games for the PS3 before they make the PS4?

Grammar Junta, attack! (4, Funny)

RyanFenton (230700) | more than 5 years ago | (#26757367)

>> Wii2

Sheesh - The proper term is WiiAlso.

Re:Grammar Junta, attack! (4, Funny)

frosty_tsm (933163) | more than 5 years ago | (#26757595)

>> Wii2

Sheesh - The proper term is WiiAlso.

Not the WiiWii?

Re:Grammar Junta, attack! (3, Funny)

RyanFenton (230700) | more than 5 years ago | (#26757647)

Not the WiiWii?

That's a limited European marketing name. And it's spelled OuiiOuii.

Re:Grammar Junta, attack! (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26757853)

More like PooPoo. Speaking of which, I'm off to take a shit!

Re:Grammar Junta, attack! (1)

eldavojohn (898314) | more than 5 years ago | (#26757607)

>> Wii2

Sheesh - The proper term is WiiAlso.

And the Xbox3? It's also kind of confusing that they're moving back 357 versions of the Xbox in the naming convention. Makes one wonder if they are still working on those other 357 prototypes. Fail early, fail often I guess.

Re:Grammar Junta, attack! (0)

Anonymous Coward | more than 5 years ago | (#26758185)

It should be called the Xbox1080, because games will finally support the 1080 video the original Xbox promised.

Re:Grammar Junta, attack! (1)

Yetihehe (971185) | more than 5 years ago | (#26757819)

Didn't you get the memo? It will be called 2gether.

Re:Grammar Junta, attack! (1)

Phroggy (441) | more than 5 years ago | (#26757831)

Perhaps one could use Roman numerals: WiiII?

Re:Grammar Junta, attack! (1)

scorp1us (235526) | more than 5 years ago | (#26757849)

WiiAgain

Re:Grammar Junta, attack! (5, Funny)

dgatwood (11270) | more than 5 years ago | (#26757885)

After the Wii2 comes the Wii3... which comes in a special "R" edition with a digital video recorder, and also comes packaged with the ever popular game, Kings of Orient, making it the Wii3/Kings of Orient/R.

Re:Grammar Junta, attack! (0)

Anonymous Coward | more than 5 years ago | (#26757941)

>> Wii2
Sheesh - The proper term is WiiAlso.

And here I was thinking it would be "both of us".

Xbox3 and Wii2? (2, Interesting)

wjh31 (1372867) | more than 5 years ago | (#26757371)

are these confirmed names or assumed names?

Re:Xbox3 and Wii2? (5, Funny)

Anonymous Coward | more than 5 years ago | (#26757389)

Yes.

Re:Xbox3 and Wii2? (1)

harry666t (1062422) | more than 5 years ago | (#26757773)

Informative!? Funny I'd say.

Re:Xbox3 and Wii2? (1)

BikeHelmet (1437881) | more than 5 years ago | (#26757465)

They are not official names - but they indicate to all readers which consoles are being referred to.

Re:Xbox3 and Wii2? (1)

frission (676318) | more than 5 years ago | (#26757951)

I've heard WiiHD

Re:Xbox3 and Wii2? (1)

twitchingbug (701187) | more than 5 years ago | (#26757537)

Hrm. I'm not sure Nintendo has ever released a console with a "2" moniker.

Re:Xbox3 and Wii2? (5, Interesting)

eln (21727) | more than 5 years ago | (#26757581)

The obvious follow-up to the Wii would be the Super Wii.

Re:Xbox3 and Wii2? (1)

frosty_tsm (933163) | more than 5 years ago | (#26757623)

The obvious follow-up to the Wii would be the Super Wii.

Is that what happens after you drink a lot of beer?

(I know I'm bringing back the jokes from 3 years ago, but I can't resist).

Re:Xbox3 and Wii2? (0)

Anonymous Coward | more than 5 years ago | (#26757813)

A console they didn't, but some of us are old enough to remember the pain in the a$$ that was Zelda 2...

Bad memories, bad memories I tell you...

Re:Xbox3 and Wii2? (1)

frankie (91710) | more than 5 years ago | (#26757723)

No. The official release names are:
  • Nintendo Wii 2 (although Wiiii is still in the running)
  • Microsoft Xbox Xtreme (also available in Media Center edition)

Ah yes! (1, Funny)

Anonymous Coward | more than 5 years ago | (#26757387)

The PC Gaming killer will be all Intel's doing.

Because when I think graphics, I think intel (4, Insightful)

scubamage (727538) | more than 5 years ago | (#26757393)

Seriously - about the only thing Intel graphics offers is raytracing. Their graphics chipsets are notoriously subpar, even the very best of them. Why would Sony send it their way? ATI makes sense for the Wii2, since they've been working with the GameCube platform since its inception... but Intel? Can someone clear this up? Do they have some magically awesome chipset that has never graced the consumer market?

Re:Because when I think graphics, I think intel (4, Insightful)

Kneo24 (688412) | more than 5 years ago | (#26757455)

Which is precisely why I think this story is bullshit. No gaming machine, whether it be console or PC, will want an Intel GPU as its workhorse for graphics. It just isn't possible. Not today. Probably not in the near future either. Unless, of course, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for their up-and-coming stick figure games.

Re:Because when I think graphics, I think intel (3, Insightful)

LordKaT (619540) | more than 5 years ago | (#26757523)

Unless, of course, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for their up-and-coming stick figure games.

Which wouldn't surprise me in the least, since Sony is more than willing to follow the pack leader to grab more market share and force their ill-conceived, DRM-laden formats on the masses.

Re:Because when I think graphics, I think intel (2, Informative)

grantek (979387) | more than 5 years ago | (#26757883)

Well, the Cell was a bit out there when it was conceived, and Larrabee's sort of in that position now. I guess Sony is taking the bad press that came from the Cell being "too difficult to code for" and running with it, still maintaining that multicore is the way to scale up performance. Good on 'em, I say (despite my overall negative feelings toward the company).

Re:Because when I think graphics, I think intel (1)

berwiki (989827) | more than 5 years ago | (#26758179)

I want to know who at Sony keeps gambling on these bizarre chipsets, hoping they get uber-lucky one time, instead of going with something tried and true.

Why do they keep doing this to themselves?
cell?! ftw?

Re:Because when I think graphics, I think intel (3, Interesting)

ByOhTek (1181381) | more than 5 years ago | (#26757765)

wouldn't that be a Wii^2?

Anyway, I think part of the reason Intel's offerings suck is that they are going for the majority of the graphics market - integrated. They aren't really trying for powerhouse GPUs.

I'm not saying this is a surefire thing, but as a rough indication, look at their primary competitor, which does try for powerhouse GPUs as well as CPUs: compare how it fares on the former against the other GPU makers, and on the latter against Intel.

If Intel decides to make a performance GPU, it might actually work.

Add to that the fact that, with a static architecture, consoles don't /need/ the raw memory, CPU, or GPU power that a gaming computer needs, and I think Intel has the potential to provide a very reasonable GPU for a console.

Re:Because when I think graphics, I think intel (2, Interesting)

berwiki (989827) | more than 5 years ago | (#26758073)

And you're going to let them 'beta test' their high-end GPU (with absolutely zero track record) in your flagship gaming console?

Re:Because when I think graphics, I think intel (1)

Tanktalus (794810) | more than 5 years ago | (#26758229)

wouldn't that be a Wii^2?

So, what, the one after that would be a Wii Cube? Talk about re-using names...

Re:Because when I think graphics, I think intel (2, Insightful)

afidel (530433) | more than 5 years ago | (#26757771)

Larrabee is for raytraced graphics, which requires lots of processors more powerful than those found in today's GPUs, and more complex interactions between functional units - both of which are strengths of Intel's. Beyond that, Intel is all but guaranteed to come through this depression, whereas the other two GPU houses are very questionable. Finally, Intel has the best fab processes in the world, so they can pack more units into a given die budget than anyone else.

It's that process size. (1)

tjstork (137384) | more than 5 years ago | (#26758133)

Intel can make a graphics chipset. They have the capital to do it, and the equipment. The best nVidia GPU uses a 65nm process; Intel already has a ton of 45nm parts out there and has successfully tested 32nm. They just haven't wanted to pay for the R&D to do it. Blowing a ton of dough to build a graphics chip just wasn't a big enough market for them, at least until Sony came along.

Now they will partner with Sony, get extensive experience in graphics, and can leverage their extensive design experience, fabs, and rather good CPU offerings. It's really a super deal for Intel, although it is a sad day for Sony, which now outsources something it long took great pride in building.

If Intel takes this new technology they come up with, and puts it into PCs, then yeah, nVidia is doomed.

Re:Because when I think graphics, I think intel (5, Informative)

Guspaz (556486) | more than 5 years ago | (#26757475)

Do they have some magically awesome chipset that has never graced the consumer market?

Yes, Larrabee [anandtech.com]. It's a massively multicore x86 processor designed to act as a GPU (it has some fixed-function GPU stuff tacked on).

In effect, Intel intends to build a GPU powerful enough to get software rendering (and all the flexibility and power that brings) up to the same speed as hardware-accelerated rendering. Intel is also going to be providing OpenGL/Direct3D abstraction layers so that existing games can work.

Larrabee is expected to at least be competitive with nVidia/AMD's stuff, although it might not be until the second generation product before they're on equal footing.
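
To make the software-rendering idea concrete, here is a minimal C++ sketch of a framebuffer shaded in parallel across however many cores the host exposes. It is purely illustrative of the many-core approach; the scanline-band split and the stand-in "shader" are invented for the example and are not Larrabee's actual programming model.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    std::vector<uint32_t> framebuffer(width * height);

    // One worker per hardware thread; Larrabee's pitch is essentially this
    // idea scaled to dozens of x86 cores with wide vector units.
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;

    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            // Each worker shades its own band of scanlines, in software.
            const int y0 = static_cast<int>(c * height / cores);
            const int y1 = static_cast<int>((c + 1) * height / cores);
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    framebuffer[y * width + x] = (x ^ y) & 0xFF; // stand-in "shader"
        });
    }
    for (auto& w : workers) w.join();
}
```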

Re:Because when I think graphics, I think intel (1, Interesting)

Midnight Thunder (17205) | more than 5 years ago | (#26757643)

Yes, Larrabee [anandtech.com]. It's a massively multicore x86 processor designed to act as a GPU (it has some fixed-function GPU stuff tacked on).

So they are taking a general-purpose CPU and using it as a GPU? Now why do I get the feeling that this is going to be suboptimal compared to a specialised processor? Also, I would hate to see the power consumption relative to the graphics capability you get out of it.

Based on what Intel has released thus far, I will not believe it until I see it.

Re:Because when I think graphics, I think intel (1)

ByOhTek (1181381) | more than 5 years ago | (#26757785)

I don't think performance will be suboptimal, but my worries are parallel to yours on power.

One of the major reasons for using a specialized chip is that you don't waste energy on the GP stuff you don't need.

Re:Because when I think graphics, I think intel (3, Informative)

geekoid (135745) | more than 5 years ago | (#26757857)

No.
It's designed to be used as a GPU:
http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org]

Looks really nice. I'm looking forward to seeing what the finished product can do. The graphics market could use another competitor at the high consumer end.

Re:Because when I think graphics, I think intel (1)

Creepy (93888) | more than 5 years ago | (#26757957)

Sony is taking a huge gamble here - first of all, without MS as a partner, they will need to develop their own graphics drivers or use whatever proprietary drivers Intel develops for them, which will make porting to other platforms a pain (if possible at all). MS is the only platform vendor that has announced real-time raytracing drivers (in the DX11 API due in March).

Raytracing requires high memory bandwidth because it needs to be scene aware - that means Sony will likely be using the newest, most expensive memory there is (probably DDR4 or GDDR5 if Larrabee itself holds the scene). It is unlikely Larrabee can compete with dedicated rasterizers at rasterization, so raytracing is its only major plus.

Developers will need to rewrite core libraries or purchase them. Want soft shadows? Buy them or re-develop them in house, because they aren't a default ray tracing feature and they require casting more (expensive) rays.

So far, Intel's demos have been at fairly low resolutions (like 512x512), and while that is theoretically scalable by processor count, 1920x1080 (1080i) is nearly 8x that size.
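
That scaling figure checks out. A quick back-of-the-envelope in C++; the 512x512 and 1920x1080 numbers are from the comment, while the one-ray-per-pixel and 60 fps assumptions are mine, not from Intel's demos:

```cpp
#include <cstdio>

int main() {
    const double demo   = 512.0 * 512.0;    // pixels in the reported demos
    const double target = 1920.0 * 1080.0;  // one full 1920x1080 frame
    std::printf("scale factor: %.1fx\n", target / demo);   // ~7.9x
    // At one primary ray per pixel and an assumed 60 fps:
    std::printf("primary rays/s: %.0f\n", target * 60.0);  // ~124 million
}
```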

I personally feel that either Intel demonstrated something to Sony that the rest of us haven't seen, or they pulled the wool over Sony's eyes by focusing on things ray tracing does well and avoiding things it does poorly (like showing lots of specular highlights and avoiding diffuse lighting).

That said, most consoles don't need true CPUs, so it is possible that Larrabee will be used more as a hybrid of some sort, handling ray tracing as needed (and maybe AI or sprite movement) while some other hardware handles rasterization. I'm mostly speculating, and to be honest, if you want Wii-like graphics with shiny perfect-sphere heads, tapping Larrabee for all graphics would probably be adequate (realistic, no, but as the Wii has proven, realism is overrated).

Re:Because when I think graphics, I think intel (1)

urbanriot (924981) | more than 5 years ago | (#26757549)

What a lot of people fail to realize is that Intel GPUs are made with power consumption and heat generation in mind, not playing the latest and greatest 3D engine. If they oriented themselves toward high-end 3D gaming, who knows what would happen?

Re:Because when I think graphics, I think intel (1)

jandrese (485) | more than 5 years ago | (#26757797)

My guess is that if they tried to go for the high-end GPU market, they would release a couple of powerful but flawed chips before they finally got it right. Unfortunately, flaws that get into a console can't be corrected until the next generation of the console 5-7 years later.

That said, console developers are accustomed to working around hardware flaws, sometimes quite severe ones, to get their games working. One thing seems certain: Sony is going to skimp on the memory again (for the fourth time) and give developers headaches.

Actually, asking Intel to design the console makes it sound like Sony has been drinking the raytracing Kool-Aid and thinks that maybe Intel can succeed where Sony failed with the PS3: creating some sort of generalized software-based rendering system.

Re:Because when I think graphics, I think intel (1)

UnknowingFool (672806) | more than 5 years ago | (#26757563)

One of the reasons mentioned in the article is that Sony views Intel as more financially stable than nVidia. Another reason is that there is no bad blood between Intel and Sony, whereas there seem to be issues between Sony and nVidia. But I agree with your sentiment that, technically, Intel has not shown any real prowess in this area.

Re:Because when I think graphics, I think intel (1)

Cutting_Crew (708624) | more than 5 years ago | (#26758041)

It seems to me nVidia is in fine shape. Besides, the PS3 currently uses OpenGL ES for its 3D graphics, and nVidia would seem to be the perfect choice.

Re:Because when I think graphics, I think intel (0, Redundant)

EdZ (755139) | more than 5 years ago | (#26757673)

about the only thing Intel graphics offers is raytracing

Maybe Sony intends to pursue raytracing, then?

Re:Because when I think graphics, I think intel (1)

scubamage (727538) | more than 5 years ago | (#26757713)

I was thinking that as I wrote it, but even Intel has said they didn't believe raytracing was really viable until 8-core CPUs were the standard, more likely 16-core. Plus there seems to be very little backing for raytracing in the games industry, with John Carmack and others disputing that it is the wave of the future. Though maybe that's the idea mentioned above with the Larrabee chipset?

Two words: Ray tracing (1)

should_be_linear (779431) | more than 5 years ago | (#26757827)

Problem is that console producers must already have at least a broad idea of what the next gen will look like. We can assume that 5 years from now not only Intel but also AMD and nVidia will have ray tracing cards. But Sony needs something *right now* that they can build their HW and SW around, and Intel is currently the only one with not only an idea of how to make a ray tracing hw/sw engine but also working prototypes of Larrabee that they can provide to Sony soon. They might go with a 2-chip solution: Larrabee for graphics plus a normal 4-way Intel CPU for everything else.

I would really like to see game engines moving to Java, because with ray tracing there is really no need to stick with C/C++ any more. All the general routines (AI, scripting, camera movement) are easy to do in Java, and the CPU doesn't have to care about graphics: you send the whole scene to the GPU for rendering, and shadows and everything else are done on the GPU. Also, Java is much more pleasant for doing multithreaded stuff.
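
For a sense of the per-pixel work being argued over: the inner loop of any ray tracer is an intersection test like the textbook ray-sphere check below. This is generic illustrative C++, not anything from Larrabee, Sony, or a shipping engine; all the names are made up for the example.

```cpp
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };

static double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Classic quadratic ray-sphere test: returns the distance along the ray to
// the nearest hit, if any. A ray tracer runs millions of tests like this per
// frame, which is why the workload parallelizes so naturally across cores.
static std::optional<double> hitSphere(const Vec3& origin, const Vec3& dir,
                                       const Vec3& center, double radius) {
    const Vec3 oc = sub(origin, center);
    const double a = dot(dir, dir);
    const double b = 2.0 * dot(oc, dir);
    const double c = dot(oc, oc) - radius * radius;
    const double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return std::nullopt;                 // ray misses entirely
    const double t = (-b - std::sqrt(disc)) / (2.0 * a); // nearer root
    return t > 0.0 ? std::optional<double>(t) : std::nullopt;
}
```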

Inquirer bullshit (5, Insightful)

Trepidity (597) | more than 5 years ago | (#26757405)

I know Slashdot isn't known for its high standards of editorial oversight, but do you really have to waste our time parroting Inquirer bullshit?

Note that the only media source claiming that Intel is designing the PS4 GPU is The Inquirer, which is wrong more often than it's right. And Sony has explicitly denied [techradar.com] the rumors.

Intel might end up designing the PS4 GPU, who knows. This story doesn't give us any more information than we had before either way, and the headline is certainly wrong in concluding otherwise.

Re:Inquirer bullshit (1)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#26757433)

The Inquirer isn't all nonsense. They did some good, and amusing, work on the story about Nvidia GPUs dying horribly a lot.

on the merits, also unlikely (2)

Trepidity (597) | more than 5 years ago | (#26757485)

For what it's worth, on the merits this is also unlikely to be a genuine leak, even if we ignore it being from the Inquirer. They claim they got this scoop from a "nice Sony engineering lady at CES". It's unlikely that a random Sony representative at CES would even be privy to such information - and if one were, it would've leaked by now. These sorts of decisions are generally kept pretty close to the chest, and don't leak very early unless someone fucks up. (When they do leak, it can lead to SEC investigations.)

Re:on the merits, also unlikely (1)

nschubach (922175) | more than 5 years ago | (#26758149)

Yeah, I read that and thought... if it were a "nice Sony engineering guy" trying to hit on the "hot lady reporter" offering her a scoop for some... ahem ... services, it might make more sense.

Re:Inquirer bullshit (0)

Anonymous Coward | more than 5 years ago | (#26757487)

I did a search on the guy who posted this crap - Charlie Demerjian - and found a listing of his other fine, fine work:

http://www.crunchbase.com/bloggerboard/tech/author/charlie-demerjian [crunchbase.com]

It's mostly sensationalist FUD and hearsay. Seriously, how does shit like this get posted? Especially from morons like this guy?

Wow, editorialize much (0)

Anonymous Coward | more than 5 years ago | (#26757439)

Nothing like a long-winded op-ed article passing for "news" to make my Friday a little gloomier. Ah well, at least the Dow is up.

Slow down and consider the implications (3, Interesting)

Bruce Perens (3872) | more than 5 years ago | (#26757457)

Remember that Intel has been willing to open its drivers and specifications. Any improvements they make to their graphics core (which, yes, is in need of improvement) will probably make their way to motherboard and laptop chipsets, and will have nice Open Source Linux drivers made for them. So, this is a good thing.

Re:Slow down and consider the implications (0)

raddan (519638) | more than 5 years ago | (#26757803)

Remember that Intel has been willing to open its drivers and specifications.

Really? Maybe my information is out of date, but that's not how I remember it [kerneltrap.org]. Are they doing something different now?

Re:Slow down and consider the implications (3, Funny)

Bruce Perens (3872) | more than 5 years ago | (#26758195)

Yes. The article you cited is from 2004. Good morning, Rip van Winkle! There have been a lot of changes. :-)

What the hell? Intel? (1)

Gizzmonic (412910) | more than 5 years ago | (#26757467)

They must have seriously low-balled NVidia. NVidia has the better tech. Intel is traveling into unknown territory here, and Sony is risking its reputation on the lowest bidder. The Playstation 3 is already a so-what console due to its bizarre architecture. I personally love it, but I know why people have shied away from it. Sony can't afford another mediocre offering... and they are taking an awfully big risk with Intel.

Re:What the hell? Intel? (1)

geekoid (135745) | more than 5 years ago | (#26757909)

No, and not true.

NVidia has tried tech, but it may not be better than:
http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org]

Intel understands chips very well, and that includes GPUs.
There are two places they could fail:
1) They don't fully utilize Larrabee's abilities in the drivers... unlikely.
2) They get eaten alive by the graphics market.

Another possibility is that it gets pulled due to the economic downturn. I consider that highly unlikely given where they are in development, and their momentum.

The Cell architecture isn't catching on among developers in the game industry. Odd, really. OTOH, most of them are hacks.

Larrabee doesn't exist, yet (2, Insightful)

Trepidity (597) | more than 5 years ago | (#26758197)

It's kind of weird to take Larrabee as evidence of Intel having successfully produced a GPU, since they still haven't produced it, despite years of hype. It might turn out to be as excellent as they claim. Or it might turn out to be as "revolutionary" as their last revolutionary new architecture, Itanium.

Cell? (3, Interesting)

Major Blud (789630) | more than 5 years ago | (#26757469)

So after all the smack talking Sony did about the power of Cell being untapped... they've decided to abandon it for their next console? But if you listen to Sony, the PS3 is a "10 year platform" anyway. That means we wouldn't see the PS4 until at least 5-6 years from now, and there is no telling what kind of processors will be available in that time frame. Do we really know if Larrabee will still be around by then? I think it's way too early for Sony to start talking about specs for the PS4. Some people are bound to stop buying PS3s because it would mean the PS4 is right around the corner, and Sony knows this. They really can't afford to halt sales of the PS3 at its current selling rate.

Re:Cell? (1)

Bruce Perens (3872) | more than 5 years ago | (#26757527)

Well, the power of Cell being untapped was a problem, wasn't it? Parallel programming is difficult.

Re:Cell? (1)

twitchingbug (701187) | more than 5 years ago | (#26757585)

Yup. I have to laugh at all those prognosticators who thought that Cell would take over the world. Parallel programming is hard.

Re:Cell? (1)

robmv (855035) | more than 5 years ago | (#26758043)

The same was said about writing multithreaded programs, but you know that's standard now, right?

Re:Cell? (1)

Creepy (93888) | more than 5 years ago | (#26758089)

Parallel programming isn't that hard if you design for it (in fact, I quite like working with threads - in my own code I keep a pool for concurrent file loading while the program is executing), but for the most part games don't need it, since they are usually throttled more by the GPU or memory bandwidth than by the CPU. Most of the parallelism needed these days is in the GPU (in the form of programmable shaders).
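
The pattern the parent describes, loading assets on worker threads while the rest of the program keeps running, can be sketched in a few lines of C++. This is a minimal sketch using std::async rather than a hand-rolled pool, and the file names are hypothetical:

```cpp
#include <fstream>
#include <future>
#include <iterator>
#include <string>
#include <vector>

// Read one file into memory; runs on a worker thread.
static std::string loadFile(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    return std::string(std::istreambuf_iterator<char>(in), {});
}

int main() {
    // Kick off the loads concurrently; the main/game loop can keep running.
    std::vector<std::future<std::string>> pending;
    for (const char* path : {"level1.dat", "textures.pak", "audio.bnk"})
        pending.emplace_back(std::async(std::launch::async, loadFile, std::string(path)));

    for (auto& f : pending) {
        std::string data = f.get(); // blocks only when this asset is needed
        // ... hand `data` off to the engine ...
    }
}
```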

Re:Cell? (0)

Anonymous Coward | more than 5 years ago | (#26757545)

The PS3's GPU was designed by nVidia. Cell is its CPU.

Re:Cell? (1)

Major Blud (789630) | more than 5 years ago | (#26757601)

Isn't Larrabee a CPU/GPU hybrid architecture and not a dedicated GPU?

Re:Cell? (1)

robmv (855035) | more than 5 years ago | (#26757931)

PS3 GPU -> NVIDIA
PS4 GPU -> Intel, based on speculation

The Cell processor is not the GPU of the PS3

Re:Cell? (1)

YesIAmAScript (886271) | more than 5 years ago | (#26757947)

Cell is being tapped right now. Have you seen MotorStorm 2? Killzone 2? The Cell is finally being properly programmed, and it looks great. It is finally doing things the Xbox 360 can't do (about time!). It definitely will be tapped out by the time the PS4 comes out.

You don't understand the 10 year platform thing. It doesn't mean 10 years between consoles, it means that there are two active consoles at a time. Like PS1 overlapped with PS2 for 4 years. Like PS2 right now has game releases weekly.

Sony isn't talking specs for the PS4; the Inquirer is. However, internally Sony has to have a PS4 ready very soon. The biggest advantage the Xbox 360 had this generation was that it came out a year earlier. Sony cannot afford to be caught behind this time; they need to have something ready a year from now in case MS announces something soon. They might not use it, if MS holds off, but they have to have it ready.

Re:Cell? (1)

nschubach (922175) | more than 5 years ago | (#26758225)

I have a feeling that this is more FUD spread by someone in the anti-Playstation camp. Just like all the rumors about price drops and such. If you spread news that Sony is abandoning the current-gen system for a whole new platform, why buy one?

Sony wouldn't announce that. It smells like some kind of marketing scam.

Economics? (0)

Anonymous Coward | more than 5 years ago | (#26757483)

"Due to current economic times it was a no brainer for Sony to go with Intel"
I'm no rocket scientist, but I fail to see how choosing Intel to create a brand new GPU, with their fledgling GPU experience, is possibly the best course of action if a company wants to reduce risk. I guess by "no brainer" they mean a decision that uses "no brains".

Re:Economics? (1)

sunking2 (521698) | more than 5 years ago | (#26757755)

I think they are trying to imply that Intel has the greatest chance of still being around and all the others can be viewed as vulnerable. Signing on with a company that may not be in business is probably not a good idea. Depending on your doom and gloom meter this may make some sense.

Re:Economics? (1)

afidel (530433) | more than 5 years ago | (#26757877)

It's questionable whether AMD/ATI and NVidia will survive the downturn if it is protracted, as many are currently speculating. If you were going to start putting money into designing a next-generation console today, who would YOU pick to design a core component? Remember that if your first GPU choice folds, you have not only lost a significant amount of time in the GPU stage, but you probably have to go back to the drawing board with the rest of the system.

Bizarre metaphor (0)

Anonymous Coward | more than 5 years ago | (#26757511)

if history is any judge

WTF? History isn't a judge. Sometimes it's a guide, but it isn't a judge. It's a bunch of stuff that already happened. How could it judge you?

Re:Bizarre metaphor (3, Insightful)

Gizzmonic (412910) | more than 5 years ago | (#26757597)

History: I put on my robe and judge's wig.

Re:Bizarre metaphor (1)

gEvil (beta) (945888) | more than 5 years ago | (#26757715)

History: I put on my robe and judge's wig.

I really don't want to know where you're going to put that gavel.

A no brainer?? (0)

Anonymous Coward | more than 5 years ago | (#26757519)

"Due to current economic times it was a no brainer for Sony to go with Intel."

This makes no sense to me. Presumably they are implying that Intel is more likely to be around since they're a larger company that is doing better than AMD/ATI. I think they'll remain around, but even if they don't, where do you think Nvidia is going? Intel is currently not even competing when it comes to mid/high-range graphics. Do they see that market disappearing (hence the failure of Nvidia) at the EXACT same time Larrabee becomes more than just buzz? Dream on.

Wow... (1)

sesshomaru (173381) | more than 5 years ago | (#26757589)

People still think we are going to have a Wii2, Playstation4, and Xbox720 in our post-apocalyptic nightmare landscape? (Hey, I admire the optimism!)

Well, maybe Tenpenny will, since he always has the best of everything.

Re:Wow... (0)

geekoid (135745) | more than 5 years ago | (#26757981)

what are you talking about?

Eww new console news already. How nice. (1)

Cathoderoytube (1088737) | more than 5 years ago | (#26757593)

It'll be interesting to see what the new consoles come up with (I have a hunch it'll entail more hard drive space, better graphics and processors). If the console makers actually use full keyboard and mouse support for their new consoles they could do a lot of damage to the PC gaming market. I mean A LOT. Really, I mean it.

Re:Eww new console news already. How nice. (1)

Zakabog (603757) | more than 5 years ago | (#26757823)

If the console makers actually use full keyboard and mouse support for their new consoles they could do a lot of damage to the PC gaming market. I mean A LOT. Really, I mean it.

You do realize that a lot more software is sold for consoles than for PCs, right? The console gaming market is huge, mostly because you just buy a console and any game out for that console will work out of the box. I haven't bought a PC game in a long time, but I'm always buying new games for my PS3 or Wii. Plus, I'm the only person I know who has a computer capable of playing the latest games (most people I know have a cheap Dell with integrated video that couldn't handle something as simple and old as Quake III).

So how is this related to Australia? (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26757635)

I RTFA and saw no connection with Australia or Australians.

Isn't there a rule now that all Slashdot stories must be about Australia, or at least have a clear connection with that country?

If kdawson was here, he would never have allowed this story anywhere near the front page without somehow connecting it with his beloved Australia.

Bring back kdawson!

Xbox3 (0, Troll)

Doug52392 (1094585) | more than 5 years ago | (#26757671)

Featuring 3 BRAND NEW error colors! The Red Ring is so previous-gen; now light up your Xbox3 with Orange, Pink, and Green Rings of Death!

Choose your edition of the Xbox3 Today:
Xbox3(r) Starter Edition(tm)
Xbox3(r) Basic Edition(tm)
Xbox3(r) Premium Edition(tm)
Xbox3(r) Gamer Edition(tm)
Xbox3(r) Uber Gamer Edition(tm)
Xbox3(r) l33t h4x0r Edition(tm)
Xbox3(r) Legendary Edition(tm)
Xbox3(r) Collector's Edition Edition(tm)

Starting at just $599.99!

Re:Xbox3 (1)

scubamage (727538) | more than 5 years ago | (#26757809)

Silly parent, your starting price completely ignores the soon-to-be-arriving hyperinflation! My guess is it'll be more in the $4,000-5,000 range at intro.

Re:Xbox3 (0)

Anonymous Coward | more than 5 years ago | (#26758189)

Hyperinflation? Ha! We're probably going to enter the first period of deflation in the US in a lifetime.

Larrabee as Console GPU (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26757687)

Gotta post AC, as I'm on too many NDAs :-(

One distinct advantage Larrabee brings to a console is that it can be both GPU and CPU -- they could make a 2-chip console: Larrabee on one, and I/O on the other. That would keep costs under control. They would probably use Larrabee chips where 1 to 4 cores fail tests -- the console version would then advertise 28 cores. They'd probably also underclock it a bit to keep it cooler.

It would give them huge economies of scale, and get people to write software for it. And it will likely be competitive (at least in raw number crunching/bandwidth) with NVDA or ATI offerings at the same time. The devil is hiding in the details of their software stack, but they do have some good guys working on that. Abrash, for one.

-- A.C.

Why Intel? Because IBM screwed Sony... (4, Informative)

NullProg (70833) | more than 5 years ago | (#26757705)

From the article:
How Sony inadvertently helped a competitor and lost position in the videogame market.

Read here: http://online.wsj.com/article/SB123069467545545011.html [wsj.com]

Enjoy,

What A Moron (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26758117)

I can't believe you actually were stupid enough to post that bullshit. What a fucking moron.

What the fuck is it about console news that causes idiots to pretend there's some award for being the biggest retard?

Re:Why Intel? Because IBM screwed Sony... (2, Informative)

weffew... (954080) | more than 5 years ago | (#26758209)

The point of that WSJ piece is that the Xbox 360 CPU and the PS3 CPU are the same because they both come from IBM?

It's bull. The Xbox 360 CPU is a totally different PowerPC architecture. I'm amazed you posted that.

C

CPU? (1)

neostorm (462848) | more than 5 years ago | (#26757789)

"...also take part in the CPU design of the console"

Wait... why did Sony spend all that time, money, and research making their supposedly super-scaling, awesomely powerful Cell processor if they're thinking of creating a new CPU for their next console? Am I missing something here?

Re:CPU? (0)

Anonymous Coward | more than 5 years ago | (#26757945)

Perhaps they feel that even "awesomely powerful" processors start to show their age after 10 years.

x86 again? (1)

Sybert42 (1309493) | more than 5 years ago | (#26757953)

Larrabee is just a bunch of gross x86 cores (maybe it's one of those grids). I'm sick of looking at gross stack traces where the instruction pointer can be all over the place.

Also, didn't Intel abandon the one chipset hardware RNG they had? Even Via has one now. The singularity can't come soon enough.

Re:x86 again? (1)

geekoid (135745) | more than 5 years ago | (#26758015)

The singularity will run on x86.

Great news: the more cores, the more singularity!

Shut the hell up (0)

Anonymous Coward | more than 5 years ago | (#26757987)

PS4? The PS3 is barely beginning to become competitive and live up to its name. It has 5 more years of life. At least 5 more years before I will drop another $800 to replace it! Sony needs to shut the fucking hell up about the NEXT system and focus on the current system, or they will lose their remaining customers!

Lucky for nVidia (2, Insightful)

argent (18001) | more than 5 years ago | (#26758035)

Gee, over the past few years the news has been all about how doing a GPU for a console not only lost you money but also pulled resources away from the profitable PC market, and the last few exchanges of first place between ATI and nVidia in that market have been attributed to this.

Intel needs any kind of GPU win, badly, and they're big enough and rich enough they can afford to lose money on each chip and make it up on volume.

It's Sony I worry about, given how utterly appalling Intel GPUs have traditionally been.

So I gotta wonder, why do you think nVidia is worried about this?

Good thing (0, Offtopic)

Phroggy (441) | more than 5 years ago | (#26758059)

I want Intel to get more experience designing GPUs that are competitive with nVidia and ATi. I want Intel to start offering PCIe graphics cards. Why?

Because Intel is committed to releasing open-source drivers. I want to support any company that does this, and if I'm in the market for a graphics card, I currently don't have that option.

backward compatibility (1)

FunkyELF (609131) | more than 5 years ago | (#26758243)

Just a guess, but it looks like backward compatibility just got hosed if they go all Intel. There was such a gap in power that the PS3 was able to emulate PS2 games on a completely different arch. I'm guessing that emulating the Cell Broadband Engine will be much harder, if possible at all, on Larrabee. This might be the final nail in Sony's coffin after their recent 95% drop in profits. I have a PS3 and a 60" Sony SXRD display, but damn do I hate Sony. It is interesting that they went with Larrabee for a GPU, which is very similar to the CBE in the first place; having both would be silly. They could have two CBEs or two Larrabees.

The PS4 will blow away ANY netbook in graphics! (1)

malevolentjelly (1057140) | more than 5 years ago | (#26758245)

Intel is going to design their GPU... that's nice. I suppose they're aiming to find a way to waste more money and bomb harder than the PS3... if Sony can outdo themselves this time, they'll never have to make another console again!
