
ARM Claims PS3-Like Graphics On Upcoming Mobile GPU

timothy posted more than 2 years ago | from the vapor-tastes-like-raspberry-foam dept.

Graphics 217

l_bratch writes, quoting the BBC: "British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices. The Mali-T658 offers up to ten times the performance of its predecessor." ARM claims that its latest GPU, which will be ready in around two years, will have graphics performance akin to the PlayStation 3's. If this has acceptable power consumption for a mobile device, could we be seeing ultra-low power hardware in high-end PCs and consoles soon?


In two years (2, Insightful)

starmonkey (2486412) | more than 2 years ago | (#38012690)

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

Re:In two years (4, Insightful)

zero.kalvin (1231372) | more than 2 years ago | (#38012762)

It doesn't matter that in two years we'd have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1 W (or however much a mobile device should), and I'd bet it wouldn't cost that much either.

Re:In two years (5, Insightful)

Anonymous Coward | more than 2 years ago | (#38013218)

Except there is NO WAY it can be done at 1 W, even at the best rate of computing improvements. Remember, they did not mention power usage in their press release; only the submitter did. While they are taking power into consideration, it seems to me more a matter of scaling, where idle usage is extremely low with the cores shut down. This is great news for mobile devices that don't expect full usage most of the time (assuming the scaling is extreme enough that idle power usage is very low).

Remember, ARM has been slowly scaling up in speed while x86 has been scaling down in power usage. It wouldn't be surprising if this new GPU uses more power than is traditionally known for ARM. That said, a lot remains to be seen; press releases and actual performance can be worlds apart. How many times has a company promised "something-like" performance only to not deliver? Hopefully it's true, though.

Re:In two years (1)

ArcherB (796902) | more than 2 years ago | (#38013240)

It doesn't matter that in two years we'd have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1 W (or however much a mobile device should), and I'd bet it wouldn't cost that much either.

I still believe that PS3 graphics will be severely dated in two years and are probably dated now. However, if this chip is truly low-power and cool-running, why not put 10+ of them on a single card?

Re:In two years (3, Insightful)

somersault (912633) | more than 2 years ago | (#38013280)

Because we're not talking about graphics cards; we're talking about single chips for use in phones, etc., where compactness and power usage are very important?

GPU Farms? (2)

AdamJS (2466928) | more than 2 years ago | (#38013684)

I don't know much about ARM GPUs, but if these turn out to be significantly lower-powered than their counterparts, couldn't multi-GPU ARM boards be put to great use for GPGPU applications?

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013746)

No, we ARE talking about using mobile hardware in desktop platforms. The reason we are is that slashdot editors don't think about what they're writing. Instead of writing about an interesting article with an accurate synopsis, they've perverted it into their own idiot take on where a new chip might be heading. In this case it means postulating that outdated hardware could be used in a less than ideal application. Because slashdot is full of fucking munter idiots.

Re:In two years (1)

TheRaven64 (641858) | more than 2 years ago | (#38013682)

I think you misunderstand ARM's market. ARM is not in the desktop market, or even in the laptop market except at the low end. They do, however, completely own the embedded market, right up to the top end of the smartphone and tablet markets. This kind of core will end up in smartphones and tablets. You will be able to run PS3-era graphics on something that fits in your pocket and works from batteries (and probably has Thunderbolt or HDMI output for connecting it up to a big screen). It isn't competing with some energy-guzzler on the desktop; it's reducing the need for such a thing to exist at all. PS3 graphics aren't as good as they can possibly be, but they're a lot better than is required for a host of applications, and now you can do all of those with a pocket-sized device.

Re:In two years (4, Insightful)

tepples (727027) | more than 2 years ago | (#38012802)

In two years, PS3-like graphics will be insufficient

Counterexample: FarmVille.

Re:In two years (5, Insightful)

trum4n (982031) | more than 2 years ago | (#38012932)

Better Counterexample: Minecraft!

Re:In two years (1)

JoeMerchant (803320) | more than 2 years ago | (#38012956)

500,000,000 downloads of Angry Birds?

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013182)

Two words and an acronym: Angry Birds 3D.

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013664)

Ohh! Can I sign up to play the beta version of Angry Birds 3D? I'll trade in my iPhone for a side-load capable Android phone if it will get me that 3D pig smashing glory sooner!

Re:In two years (2, Informative)

ifrag (984323) | more than 2 years ago | (#38013552)

Although in Minecraft you can get some high-res textures that make the game look a little more modern, and there are also modded shaders which can do some neat stuff as well, even stuff like bump mapping.

I was playing with the default 16x16 for a long time, but I finally got a little sick of it and made the switch up to 32x32.

Re:In two years (1)

jellomizer (103300) | more than 2 years ago | (#38012894)

Well, that is the point: the PS3 is old in computer terms. While this is an advancement for mobile computing, it still fits the pattern that, in terms of performance, mobile computing is about 5-10 years behind desktop computing.

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013216)

The GPU in the PS3 was old before it was released, but then GPUs have plateaued. Other than tessellation and other 3D-like gimmicks, nothing much has been added to GPUs lately. Hollywood has forced 1080p on us, and even middle-of-the-road GPUs can do that. Sure, you can get 30" 2560x1600 monitors and Eyefinity for the high-end market, but this means that in a few years every level of gaming GPU will be capable of 3D graphics at 1080p.

It means that writing games will be cheaper: make a mobile app and port it to consoles & PCs. Yay! Just as consoles destroyed PC gaming, mobile gaming will destroy console gaming.

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013572)

1080p on middle-of-the-road? Mobile GPUs can do it already.

The problem is that software doesn't take advantage of GPUs in significant ways. We are perpetually stuck catering to the weakest GPU, hence the stall-out where DX11 parts exist but games only have DX9 rendering paths, because of Intel GMA shit in laptops and DX9 parts in consoles.

Microsoft and Nintendo need to step up to the plate and release the next half-point generation. (Nintendo has been doing tick-tock with their portables: GB, GBC; GBA, GBA clamshell, GBA brighter screen; DS, DS Lite, DSi, DSi XL; 3DS.) The Wii was really more of a half upgrade to the GameCube, but the Wii U will be the full generation update. Microsoft has released the Xbox 360 so many times now because of hardware inadequacies, and with the current model (Xbox 360 S + Kinect) it has reached its half-point generation without much improvement. The next model needs to match or leapfrog the Wii U, but because GPU power has only marginally increased since then, most of the improvement has to come from an increase in CPU cores (12 cores?).

The PS3, unfortunately, is a dead end. Whatever version comes out next will lack all backwards compatibility, whereas I don't see Nintendo or Microsoft jettisoning their compatibility. Without the Linux/OtherOS support there is nothing the PS3.5 or PS4 could offer. The PS3 is only ever so slightly better than the Xbox 360 because of the BD player (like the PS2's DVD player, allowing for larger discs). I bought an Xbox 360 because more of the exclusive games in NA are on the Xbox, whereas if I were in JP I'd have bought the PS3 for similar reasons. Sony and Nintendo could just kiss and make up, put PS3/PS2/PS1 dynamic-recompilation compatibility in the Wii U, and make it work with the PS3 controllers. In theory all the current-generation consoles' CPUs are compatible with each other, but the PS3's CPU would require more cores in the Wii U to emulate, or emulation on the GPU.

My hope is that the next consoles are 8- or 12-core systems, and developers' arms get a bit twisted into having to thread games properly. Whatever comes out next should be able to beat the pants off an 8-core Core i7 with a Radeon HD 7xxx series GPU, but I think we're really only going to see quad-core C2D CPUs with Radeon HD 2000 performance. It would be nice for once if a game console could "also be used as a desktop," much like mobile phones can "also be used to play games," because that would shoot a hole in the need for desktop PCs entirely.

Anyhow, with a 64-bit quad-core ARM + GPU (PS3-capable), the next generation of mobile devices (think iPad 4) could negate the need for game consoles... provided that the iPad 4 provides compatibility with PS3/Xbox 360/Wii joystick devices. Until then, game controllers with a television are the status quo, and tablet PCs will not replace them because they lack tactile feedback. Or, as The Register refers to them, fondleslabs.

Mobile has less power (3, Insightful)

bill_mcgonigle (4333) | more than 2 years ago | (#38013356)

mobile computing is about 5-10 years behind desktop computing

And it always will be, unless somebody devises a way to provide 15 A of power to a mobile device, and a way to dissipate that sort of heat.

Now, we may eventually reach a state where it just doesn't matter - everybody will have enough computing power on their phone to raytrace a 4K HD stream in realtime, and we will reach a natural equilibrium where it just doesn't make sense to make faster chips for desktop computers. Or we might see such great Internet pervasiveness that everybody just has thin clients and computes on a CPU farm. But until either of those things happens, desktops will be faster than mobile devices.

Re:In two years (5, Interesting)

poetmatt (793785) | more than 2 years ago | (#38013010)

Umm, look at the Tegra 3. ARM graphics are catching up to consoles quite easily (consoles were always behind). Remember, in 3 years we went from "ARM can barely handle Nintendo emulation (single core / 500 MHz / 125 MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5 GHz, 300+ MHz multicore GPU)". In *3* years. All with devices that are more power-efficient than anything Intel can offer. So what do you see for the next 12 months, let alone 3-4 years? Even if the increases slow down, they're basically going to make x86 processors irrelevant.

Emulation isn't necessarily a fair comparison (3, Insightful)

tepples (727027) | more than 2 years ago | (#38013088)

in 3 years we went from "ARM can barely handle Nintendo emulation (single core / 500 MHz / 125 MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5 GHz, 300+ MHz multicore GPU)". In *3* years.

Are you comparing emulating an NES to running native games? An emulator has to contend with the entire game engine being written in bytecode, and it has to process graphics a scanline at a time so that games' raster effects (parallax scrolling, fixed position status bars, Pole Position/Street Fighter/NBA Jam floor warping, etc.) still work. A native game running on a 3D GPU doesn't need the bytecode interpreter overhead, and it can render one object at a time because it doesn't need raster effects.
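
To make that overhead concrete, here is a minimal Python sketch of the two rendering models; the ppu/gpu objects and their method names are illustrative stand-ins, not any real emulator's or driver's API:

```python
# Hypothetical sketch of scanline-accurate emulation vs. native rendering.
# All object and method names here are made up for illustration.

SCREEN_W, SCREEN_H = 256, 240  # NES-like output resolution

def emulate_frame(ppu, framebuffer):
    """Scanline-at-a-time emulation: the game may rewrite scroll registers
    between any two lines (status bars, floor warping), so the emulator
    must re-read raster state and redraw every line, strictly in order."""
    for y in range(SCREEN_H):
        ppu.run_cpu_until_scanline(y)        # game code may change scroll here
        scroll_x, scroll_y = ppu.current_scroll()
        for x in range(SCREEN_W):
            framebuffer[y][x] = ppu.background_pixel(x + scroll_x, y + scroll_y)
        ppu.overlay_sprites_on_line(y, framebuffer)

def render_frame_native(gpu, scene):
    """Native 3D rendering: no raster effects to honor, so the scene is
    submitted object by object and the GPU schedules the work freely."""
    for mesh in scene.meshes:
        gpu.draw(mesh)  # one batched call per object, not per scanline
```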

Re:Emulation isn't necessarily a fair comparison (0)

Anonymous Coward | more than 2 years ago | (#38013328)

Probably talking about the N64; the Xperia Play plays those comfortably. And modern smartphones have more RAM than the PS3.

Re:Emulation isn't necessarily a fair comparison (1)

Hatta (162192) | more than 2 years ago | (#38013634)

Compare emulating an NES on a handheld to emulating it on a PC. FCEU runs well on a 200 MHz Pentium. Shouldn't a 500 MHz ARM do better?

Re:Emulation isn't necessarily a fair comparison (0)

poetmatt (793785) | more than 2 years ago | (#38013742)

well, I suppose I merged two issues, but my point is that very soon it's likely that ARM will surpass current and future console capability, as well as x86 desktop and server capability, power, and performance per watt. It's not far off the horizon.

Re:In two years (0)

Anonymous Coward | more than 2 years ago | (#38013846)

Emulation is a poor example. You need something like a 3.0 GHz processor to even emulate the SNES fully.

http://arstechnica.com/gaming/news/2011/08/accuracy-takes-power-one-mans-3ghz-quest-to-build-a-perfect-snes-emulator.ars

Re:In two years (1)

Korvin20111803 (2019784) | more than 2 years ago | (#38013148)

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

It has been insufficient for several years already. Consoles have not improved for 6 years, and the constantly developing desktop keeps getting console graphics (excluding only Crysis 1), except with better resolution, framerates, and minor improvements, if any.

Re:In two years (1)

bberens (965711) | more than 2 years ago | (#38013564)

I kind of hope for more stagnation in the graphics quality market. Let's just hang out where we are for a while and hopefully the game makers will start competing on interesting story lines, game mechanics, etc. rather than ripples in water in puddles.

Re:In two years (1)

geekoid (135745) | more than 2 years ago | (#38013204)

It's for the mobile market. So, MW3 on your phone.

Re:In two years (1)

ifrag (984323) | more than 2 years ago | (#38013638)

From the reviews I've read, running MW3 isn't really that much of a technological achievement. Maybe if it was Battlefield 3...

Re:In two years (2)

nschubach (922175) | more than 2 years ago | (#38013886)

Pfft, Mechwarrior 3 wouldn't be a problem. ;)

$30 Video Game System (2)

bill_mcgonigle (4333) | more than 2 years ago | (#38013296)

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

Never underestimate the low end. Imagine a dongle with an HDMI plug on one end that just plugs into a TV set, but inside it has a chip that can do PS3-level graphics, WiFi for downloading games, Bluetooth for controllers, and enough flash to cache them.

Most HDMI ports can provide 150 mA at 5 V, which is too little for this sort of application today, but within sight in the next several years.
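
As a rough check on that budget (a back-of-the-envelope sketch; the 1 W SoC figure below is an assumed number for illustration, not a measurement):

```python
# HDMI +5V power budget vs. an assumed phone-class SoC draw.
hdmi_volts = 5.0    # V, the HDMI +5V pin
hdmi_amps = 0.150   # A, the typical guaranteed source current mentioned above

budget_w = hdmi_volts * hdmi_amps
print(f"HDMI +5V budget: {budget_w:.2f} W")   # 0.75 W

soc_estimate_w = 1.0  # assumed full-load draw of a phone-class SoC
print(f"Shortfall vs a ~1 W SoC: {soc_estimate_w - budget_w:.2f} W")
```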

Re:In two years (1)

kelemvor4 (1980226) | more than 2 years ago | (#38013618)

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

PS3 graphics are a bit dated already. Consoles (and console ports) are seriously limiting the graphics in current run games. It's a pity, really. Good that cell phones will have circa 2006 GPU capabilities soon, though.

Re:In two years maybe Intel will catch-up (-1)

Anonymous Coward | more than 2 years ago | (#38014068)

They must think we are idiots. Intel's i7 has yet to exceed the performance of the Cell Processor and Cell is a Ten year old design. The NeoCons (Royals) keep pushing/promoting their family processor (ARM), so the Proles will be hobbled with inferior encryption.

Resolution! (5, Insightful)

jonnythan (79727) | more than 2 years ago | (#38012698)

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.
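
For what it's worth, the pixel arithmetic behind that estimate works out roughly as follows (a quick sketch; the phone resolutions are typical 2011 panels, chosen for illustration):

```python
# Pixel counts of typical 2011 phone panels vs. the PS3's output modes.
ps3_720p = 1280 * 720     # 921,600 px
ps3_1080p = 1920 * 1080   # 2,073,600 px

phones = {"WVGA (800x480)": 800 * 480, "qHD (960x540)": 960 * 540}
for name, px in phones.items():
    print(f"{name}: {px / ps3_720p:.0%} of 720p, {px / ps3_1080p:.0%} of 1080p")
# WVGA (800x480): 42% of 720p, 19% of 1080p
# qHD (960x540): 56% of 720p, 25% of 1080p
```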

Re:Resolution! (1)

GodfatherofSoul (174979) | more than 2 years ago | (#38012780)

I think you're looking at the wrong side of the street. This isn't about the top-end computing power; it's about the efficiencies on the bottom end. So, now you can start churning out laptops and cheap PCs with pedestrian graphics cards that use low power and provide significant performance. No need to take the truck nuts off your Dell, sir.

Re:Resolution! (1)

jonnythan (79727) | more than 2 years ago | (#38012882)

I was addressing the question at the end:

"could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

Wii and Wii U (1)

tepples (727027) | more than 2 years ago | (#38013166)

could we be seeing ultra-low power hardware in high-end PCs and consoles soon?

I thought that was the entire point of the Wii. Because the "sensor bar" (IR position reference emitter banks) needed to sit by the TV, the console's case needed to be small. This meant Nintendo couldn't use a CPU and GPU with a high TDP, so it stuck with what is essentially a 50% overclocked GameCube. I guess Nintendo is trying the same tactic with the Wii U: take a roughly Xbox 360-class CPU and GPU and take advantage of six years of process shrinks to get the TDP down so it'll fit in the same size case.

Re:Wii and Wii U (0)

joocemann (1273720) | more than 2 years ago | (#38013282)

Be careful what you say. Fanboys with buyer's remorse will defend the crap out of that console they hardly touched.

Re:Resolution! (1)

hairyfeet (841228) | more than 2 years ago | (#38013566)

Sure we will, at least on the PCs, although it won't cost a lot, and it'll be from AMD and Intel, like with Brazos and Ivy Bridge CULV. I mean, when I can pick up a netbook for $350 with 8 GB of RAM that will play L4D and TF2, gets 6 hours on a battery while watching HD movies, outputs 1080p over HDMI, and all in a machine that only weighs 3 pounds and costs less than my craptastic Celeron laptop did 5 years ago? Now THAT is nice!

I think the next advance will be just how far what was once considered "gamer only" levels of graphics performance will spread. I mean, it wasn't that long ago that if you wanted to play anything more than DVDs, or better than SD, you needed a discrete card in your laptop that jumped the price like crazy and made the battery life shit; or how in desktops IGP was a dirty word; and how cell phones had to be dropped to almost comically bad resolutions just to get more than a slideshow. Now you have all these machines, either on the market or coming down the pipe, that get frankly insane levels of graphics for prices so cheap anybody can have one.

Personally I'm ALL for it. I don't know about the ARM side, but on the x86 side it looks like OpenCL (which is now supported by Nvidia as well as AMD) is gonna make more and more programs accelerated by the GPU; power draw seems to be dropping on the desktop side, and on the mobile side it's getting crazy how much performance per watt some of these things are getting; and it's all gonna be smooth video and nice, if not truly Crysis-insane, levels on the gaming front. If ARM can keep current power usage levels and get PS3 graphics? I say great, more avenues to sell them means more games!

Re:Resolution! (3, Informative)

Anonymous Coward | more than 2 years ago | (#38012868)

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.

Except most phones released today have 1080p output via HDMI. So now what?

Re:Resolution! (1)

jonnythan (79727) | more than 2 years ago | (#38012908)

And do you know of any phones that allow you to play games at 1080p using the HDMI output? No?

Neither do I.

Re:Resolution! (5, Informative)

aristotle-dude (626586) | more than 2 years ago | (#38013020)

And do you know of any phones that allow you to play games at 1080p using the HDMI output? No?

Neither do I.

The iPhone 4S and its bigger brother, the iPad 2 tablet.

Re:Resolution! (4, Interesting)

Sockatume (732728) | more than 2 years ago | (#38013028)

The Galaxy Nexus' built-in display is 720p. (That it's PenTile is irrelevant to this issue.) If it follows a similar arc to the original Nexus, those screens will be showing up in low-end phones within a couple of years.

Re:Resolution! (1)

tepples (727027) | more than 2 years ago | (#38013194)

That it's Pentile is irrelevant to this issue.

I disagree. PenTile means games can skimp on the antialiasing.

Re:Resolution! (1)

Sockatume (732728) | more than 2 years ago | (#38013300)

PenTile performs sub-pixel rendering by necessity, but that's just approximating the image that would be created by an RGB display. It's not going to do anything for aliasing artifacts.

Re:Resolution! (0)

Anonymous Coward | more than 2 years ago | (#38013786)

When the pixels are small enough, there's less and less need for antialiasing. 96 DPI is just low. Very low.

Re:Resolution! (0)

Anonymous Coward | more than 2 years ago | (#38013068)

Well...
The chip in the Roku 2 was designed for smartphones and tablets, and it will play Angry Birds on your 1080p TV.
There's no reason that a phone with that chip in it couldn't play games on your 1080p TV as well.
It's also the same chip as in the Raspberry Pi.

Re:Resolution! (1)

GeLeTo (527660) | more than 2 years ago | (#38012880)

This chip will work in tablets, so I am not sure about the 20-40% of the pixels thing. Also, PS3 games look much better than their PC counterparts on similar hardware, thanks to the fine-tuning and specific optimisations which are possible only on fixed hardware. In order to match the PS3, the Mali GPU actually has to be more powerful. And let's not forget that the power consumption will be orders of magnitude lower. It definitely will not be high-end, but it might still be more powerful than most of the GPUs that will be sold at the time of release (e.g. integrated Intel crap).

Re:Resolution! (1)

Oswald McWeany (2428506) | more than 2 years ago | (#38012984)

There will still be some hard-core graphics-intensive games that will require whatever the cutting edge in graphics is at that point.

However, as old as the PS3 may be, the fact is that for most of us non-hard-core gamers, PS3-quality graphics are more than enough (and will still be in another 5 years' time) for the vast majority of games we'd want to play.

We're beginning to hit a point of diminishing returns on graphics anyway; you're always going to be limited by what the eye can process, and by the ability of the artists... sure, when 3D goes mainstream and is built into our monitors, all of a sudden graphics cards will need to be more powerful.

I personally can't think of one game I have played in the last 5 years where going beyond PS3-quality graphics would have improved the game for me.

Now, I'm not hard-core, and I don't play the first-person shoot-em-up genre, which tends to be the most GPU-intensive, but people like me make up an increasingly significant portion of the game market.

Look at how successful simple things like Angry Birds can be.

Re:Resolution! (1)

jeffmeden (135043) | more than 2 years ago | (#38013264)

In 2 years a phone with a 1080p display is a likely reality. We already have phones/tablets running at or near 1280x720, which is under half the 1080p pixel count. But to say that it would be acceptable on the high-end PC side is a stretch; in 2 years we will probably have desktop expectations beyond 1080p. Entry-level to mid-market could see a benefit, though; that market has been underserved by horrible attempts at "integrated" graphics for years. It will be interesting to see how this GPU compares to the beefed-up CPU/GPU integrations coming from AMD. The new frontier (aside from serious game enthusiasts) is smaller and greener.

Re:Resolution! (2)

phoenix_rizzen (256998) | more than 2 years ago | (#38013708)

Samsung Galaxy Nexus has a 1280x720 screen. And most Android 4.0 devices coming out in the next 12 months will include 1280x720 screens.

And pretty much every Android device released this year includes a mini-HDMI port for connecting to 720p and 1080p screens.

IOW, current and future Android phones can already do what you think they can't.

Re:Resolution! (0)

Anonymous Coward | more than 2 years ago | (#38013804)

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.

The last generation of mobile devices already supports higher than 1280x720. The current generation already supports higher than 1920x1080. The first Xoom had a resolution of 1280x800. The Xoom II has a resolution of 2048x1536.

The PS3 doesn't support as high a resolution as an Android tablet that doesn't even use the Tegra 3. Kinda sad.

Re:Resolution! (0)

Anonymous Coward | more than 2 years ago | (#38014110)

You realize that in most cases PS3 is *not* "doing it" at 1080p. In fact, many games drop even below 720p.

More information here... (5, Informative)

allanw (842185) | more than 2 years ago | (#38012750)

One fly in the ointment, though (-1)

Anonymous Coward | more than 2 years ago | (#38013458)

British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices.

Is that an "Electrics by Lucas" sticker I see there?

No (4, Informative)

Baloroth (2370816) | more than 2 years ago | (#38012758)

The PS3 is 5 years old and based on even older graphics tech. Beating that on mobile is cool, but not surprising. The PS3 never was impressive, graphically, to PC users, who had better-than-HD resolutions for years. Some console games are still limited to 720p. Oh, and people had 3D on PC like 8 years ago (or more). Sucked then, sucks now.

Re:No (4, Informative)

Nemyst (1383049) | more than 2 years ago | (#38013128)

Some? Make that most. You can count on two hands the 1080p, 60 fps games across both the 360 and PS3, most of them being 2D games that don't need any sort of graphical power to run.

Re:No (1)

AdamJS (2466928) | more than 2 years ago | (#38013710)

Having the equivalent of a 7600GT in a super-low-power mobile form factor would be great, especially considering the actual demands (resolution/AA) would be lower anyway.

Sure, in the same sense than PS2 is akin to PS3 (1)

Anonymous Coward | more than 2 years ago | (#38012800)

The current-gen Mali drives graphics that look like they came out of a PS1. If you look at the graph in the Anandtech article, you'll see that they mean 10x faster than the low-end Mali, not what we have in the GS2. So that would mean the next-gen part might be able to be like a PS2, though I think that's wildly overestimating performance.

Tegra 5 (1, Troll)

backslashdot (95548) | more than 2 years ago | (#38012830)

nVidia is committed to releasing a new Tegra chip every year. The Tegra 3, which is already out, is 5x faster than the Tegra 2 (which beats the Mali-400, which is at 1/10th the speed of the GPU ARM announced). So basically, by the time this ARM GPU is released, Tegra 5 will be out, and going by the roadmap of how fast Tegra 5 will be, it will run at least 5x faster than ARM's chip.

I hope ARM prices this dirt cheap, so that sub-$200 (off-contract) phones can have it.
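
Spelling out that extrapolation chain (every ratio below is a claim from this comment or the summary, not a benchmark, and the Tegra 2 margin is an assumed number; as a reply below notes, extrapolating like this is risky):

```python
# Relative GPU performance, normalized to the Mali-400 = 1.0.
mali_400 = 1.0
mali_t658 = 10 * mali_400   # ARM's claimed 10x uplift, due in ~2 years
tegra_2 = 1.2 * mali_400    # assumed margin for "beats the Mali 400"
tegra_3 = 5 * tegra_2       # the claimed 5x generational jump
tegra_5 = tegra_3 * 5 * 5   # two more yearly 5x jumps by the Mali-T658's launch

print(f"Tegra 5 vs Mali-T658: {tegra_5 / mali_t658:.0f}x")  # ~15x, under these assumptions
```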

Re:Tegra 5 (2)

Andy Dodd (701) | more than 2 years ago | (#38012916)

The Tegra 2's GPU is NOT that hot.

Hell, it can't even play H.264 Main/High Profile video at 720p. The Mali-400 has no problem with this.

(I own a Tegra2 device and an Exynos device with a Mali-400 - in almost any workload, the Exynos utterly dominates the Tegra2 despite the CPU only being clocked 20% higher.)

Re:Tegra 5 (0)

Anonymous Coward | more than 2 years ago | (#38012980)

I have a ViewSonic gTablet and it can play at 720p using MoboPlayer. What are you using?

Re:Tegra 5 (1)

sed quid in infernos (1167989) | more than 2 years ago | (#38013136)

Hell it can't even play H.264 Main/High profile video at 720p.

Both my Transformer and my Xoom have been able to play H.264 Main/High Profile since Android 3.1 came out. The original problem was caused by a software problem, not hardware. Link [xda-developers.com].

Re:Tegra 5 (4, Insightful)

abigsmurf (919188) | more than 2 years ago | (#38012930)

Doesn't Tegra have major heat issues that stop it from being in anything smaller than tablets?

Both Sony and Nintendo considered using it for their new consoles but the heat and power usage apparently made them turn away from it.

3DS is backwards compatible with the DS (3, Informative)

tepples (727027) | more than 2 years ago | (#38013268)

Both Sony and Nintendo considered using it for their new consoles but the heat and power usage apparently made them turn away from it.

And Nintendo ended up using something just as hot and power-hungry for the 3DS. As I understand it, the reason Nintendo ditched Tegra for the 3DS had everything to do with the fact that Tegra wouldn't work with an ARM9 core (ARMv5), and Nintendo needed something cycle-accurate to the ARM946E in order to play DS and DSi games without glitches.

Re:Tegra 5 (1)

UnknowingFool (672806) | more than 2 years ago | (#38013030)

How many companies use ARM's GPU? nVidia uses their own GeForce. Qualcomm uses Adreno. TI and Apple use PowerVR. Samsung uses PowerVR and ARM. But as far as I know, they are the only ones that use ARM's GPUs.

Re:Tegra 5 (0)

Anonymous Coward | more than 2 years ago | (#38014082)

There are TONS of manufacturers that license and produce ARM parts or use ARM cores in their products, not limited to your list, e.g. Nuvoton, NXP, ST, Xilinx, Altera.

Almost all microcontroller vendors have ARM-based 32-bit products now (if not, it would be a competitor such as MIPS). Even your $20 made-in-China DVD player has an ARM core inside, along with an 8-bit 8051 core.

Re:Tegra 5 (0)

Spykk (823586) | more than 2 years ago | (#38014074)

You should be careful with extrapolation [xkcd.com] ...

Yea right (4, Insightful)

nedlohs (1335013) | more than 2 years ago | (#38012838)

In 2 years time the PS3 will be 7 years old.

The PS2 was 7 years old in 2007. Were PS2 level graphics acceptable for "high end PCs and consoles" in 2007?

No? Then why would PS3 level be acceptable in 2013?

Re:Yea right (4, Interesting)

Jeng (926980) | more than 2 years ago | (#38012934)

Because we are getting to the point in technology where we humans won't be able to perceive the difference in graphics.

You can only make something so lifelike; after that you might as well aim at efficiency.

Re:Yea right (4, Informative)

0123456 (636235) | more than 2 years ago | (#38012990)

Because we are getting to the point in technology that us humans won't be able to perceive the difference in graphics.

Hollywood is getting close, but they have huge render farms, terabytes of source data and can spend hours rendering a single frame. GPUs are still a long way from producing photo-realistic output.

Hollywood has bigger screens to fill (1)

brokeninside (34168) | more than 2 years ago | (#38013102)

The computational cost of filling a 5, 7 or 10 inch screen for a mobile device with a photo-realistic image is far smaller than doing the same for a twenty foot tall movie screen.

Even if your mobile device gets plugged into an HDTV, it's still nowhere near the same problem that Hollywood faces creating output that will be shown in theaters with basketball court sized screens with the front row ten feet away.

Re:Hollywood has bigger screens to fill (4, Informative)

Nemyst (1383049) | more than 2 years ago | (#38013164)

4K is only four times the pixels of standard 1080p video. There is still no way realtime rendering of Pixar-like stuff is happening in the near future, be it on mobiles or desktops.

Re:Hollywood has bigger screens to fill (3, Interesting)

TheRaven64 (641858) | more than 2 years ago | (#38013850)

The more important limitation is not human perception, it's cost. Remember the models in Quake? Remember the mods? A fairly competent 3D artist could knock out something like the Quake guy in a day or two. Now compare that to a modern game. A single tree in a modern FPS has more complexity than every model on a Quake level combined. That all translates to vastly more artist time, which translates to greater expense. For a Pixar film, you can spend a huge amount developing and texturing every model, but for a game the upper limit is a lot lower.

Re:Yea right (1)

geekoid (135745) | more than 2 years ago | (#38013246)

" GPUs are still a long way from producing photo-realistic output." In a reasonable time.

Re:Yea right (2)

Pulzar (81031) | more than 2 years ago | (#38013726)

Because we are getting to the point in technology where we humans won't be able to perceive the difference in graphics. You can only make something so lifelike; after that you might as well aim at efficiency.

Is there a single game out there that's so lifelike that you can't perceive the difference between it and a real video?

There's plenty more room for improvement, we're not getting anywhere close to that point.

Re:Yea right (1)

Twinbee (767046) | more than 2 years ago | (#38014146)

A few words for you: Global illumination, path-tracing, trillions of particles, atom worlds, AI.

See items 5, 3 and 2 on this page:
http://www.skytopia.com/project/cpu/cpu3.html [skytopia.com]

Also remember that "lifelike" isn't necessarily an ideal, and that there are things we can see which far exceed the mundane visuals you can get from the relatively dull world we inhabit.

Re:Yea right (0)

Anonymous Coward | more than 2 years ago | (#38012972)

Were PS2 level graphics acceptable for "high end PCs and consoles" in 2007?

You omitted the "ultra-low power" part of that quote.

Re:Yea right (0)

nedlohs (1335013) | more than 2 years ago | (#38013134)

It's irrelevant to the point, which is whether that level of graphics is considered "high end".

Obviously you use the lowest power you can get away with (since heat is painful to deal with, especially for a console that you don't want to sound like a jet taking off). What you can get away with and still be called "high end" is the only question.

Re:Yea right (2)

LWATCDR (28044) | more than 2 years ago | (#38013212)

High end was a dumb thing to add. PCs in general, yes. If it can pump out 1080p, it will be good enough for 99.7% of current PC users. Are people going to run CAD or high-end video games on it? Probably not. Gamers just don't seem to get how small a percentage of PC users they are. For a good long time, PCs will probably be stuck at 1080p for the majority of monitors, since TVs will keep the cost of the panels low for a good while.

Re:Yea right (1)

nedlohs (1335013) | more than 2 years ago | (#38013896)

Sure, but "consoles" was also there making "high end" PCs the correct subset to use. Mind you my PC plays games just fine and isn't what I would call high end...

Re:Yea right (1)

LWATCDR (28044) | more than 2 years ago | (#38014012)

Not really. The Wii was not state of the art when it came out and did very well. I don't hear people screaming for better graphics than the PS3 or the 360. Combine that with the rise of casual games, and yes, it could run a console well enough for many users. The high-end market could shrink, and frankly is shrinking. You can get good video cards (and I do mean good cards) for around $120 now that will run games very well on the average monitor. You only need the high-end cards for 27" high-resolution monitors like the Apple Thunderbolt Display and others that share that resolution.

Re:Yea right (0)

Anonymous Coward | more than 2 years ago | (#38014170)

Don't forget that a console at its launch date is not state of the art. The hardware, particularly the GPU, is already well over a year old.

graphics, star trek, and the post-PC era (3, Insightful)

Anonymous Coward | more than 2 years ago | (#38012842)

There was a story on CNN a few weeks ago that said that while PC sales are slowly increasing in the entire world, it's very tilted: they are falling dramatically in the US, Canada, and Europe. The increase is coming from the developing world being able to afford computers as they fall in price.

The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do. OK, you'll always get a few neckbeards to say "But the cell phone can't run AutoSuperCadShop 390322.55!" But that misses the point. That's not what 99.9% of consumers DO with their computers. They play some games, including 3D games, they check their Facebooks, they look at some news headlines, and so on. All that works fine on a device that they can fit in their pocket. For those times a keyboard is needed, a Bluetooth keyboard will do just fine. And for those times a larger screen is needed, a wireless connection to the screen will work fine.

I don't know why people can't see this shift happening right in front of their eyes. Even the sales data bears it out now: mobile computing is on the upswing, and in the western world PC sales are falling. It's a nice world: instead of needing to lug around a desktop or even a netbook, you'll have the capability of a (say) 2009-vintage desktop in your shirt pocket in 2014. A 2009 desktop is nothing to sneeze at, and meets the needs of 99% of the population just fine. The rest will become a more expensive niche, but will still exist in some way.

It's a Star Trek Future.

Re:graphics, star trek, and the post-PC era (1)

0123456 (636235) | more than 2 years ago | (#38013044)

The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do.

No, they're not. They're becoming powerful enough to check your email and play Farmville, which is all that many people used to do with their PCs; they're not much good for actual productive work.

Meanwhile, PC gaming has stagnated due to Microsoft concentrating on pushing console games, so there's little reason for the average home user to upgrade. Word won't let you write stuff ten times faster just because you switched from a Pentium 4 to an i7, and when games are limited by being designed for an Xbox and then ported over, your super-fast GPU will be sitting idle much of the time waiting for something to do.

Re:graphics, star trek, and the post-PC era (0)

Anonymous Coward | more than 2 years ago | (#38013302)

Yes they are. Lots of useful work was done on an 80286 platform. Your mobile computer has much more CPU / GPU power, a higher screen resolution, and way more RAM.

Re:graphics, star trek, and the post-PC era (-1)

Anonymous Coward | more than 2 years ago | (#38013332)

"x86 PCs are not powerful enough to do real work: for that, you need a 68000. The 68000 is not powerful enough to do real work: for that, you need a mainframe".

Keep on dreaming, dude. The world is changing. You either accept it and move with it, or you get left behind. Mobile is THE next big thing.

Re:graphics, star trek, and the post-PC era (1)

ksd1337 (1029386) | more than 2 years ago | (#38013350)

your super-fast GPU will be sitting idle much of the time waiting for something to do.

One word: Crysis.

Re:graphics, star trek, and the post-PC era (0)

Anonymous Coward | more than 2 years ago | (#38013442)

Fuck Crysis. It was nothing more than an entry in the dick-showing contest that is PC gaming. The storyline sucked, the gameplay sucked. Only thing that didn't suck was the graphics.

Also, Crysis came out in 2007. It's four-year-old tech and really isn't impressive anymore.

Re:graphics, star trek, and the post-PC era (2)

UnknowingFool (672806) | more than 2 years ago | (#38013942)

I think his point, which you missed, is that checking email and playing FarmVille is what the majority of consumers do. Most of them are not playing leet games or rendering animation. In business, they might write in Word or crunch a few numbers in Excel. It doesn't take a quad-core Core i7 to do that. The stagnation comes from the fact that a desktop made 5 years ago will handle the majority of their tasks, and mobile computing is approaching the point where it handles a good majority too. Also, mobile computing is starting to let consumers do things they didn't do before. How many people do you know who watched movies on their computer via Netflix, compared to the numbers that watch them on their tablets?

Re:graphics, star trek, and the post-PC era (0)

Anonymous Coward | more than 2 years ago | (#38014116)

Uh, a desktop from 15 years ago can handle most of the tasks that common people need.

Aside from intense flash crapplications, but flash has always been a clusterfuck of inefficiency. Video can be handled by dedicated decode-hardware video cards far more efficiently than in a general purpose CPU. Everything else is trivial in a well-designed system running on a Pentium 133.

For commoners, needless software bloat has been driving the hardware industry, not actual productivity.

For a power user re-encoding 30-bit h264 video or running raytracing software, something a bit more powerful is needed.

Re:graphics, star trek, and the post-PC era (0)

Anonymous Coward | more than 2 years ago | (#38013230)

You probably are talking about this:

"NEW YORK (CNNMoney) -- The trend is clear: Personal computer sales are slumping, and smartphone and tablet sales are booming. But Intel proved late Tuesday that the PC isn't going away anytime soon. Consumer demand for PCs in the United States, Canada and Europe is slumping badly, but consumers in emerging markets can't get enough of them, Intel said."

This shift will be slow, but it WILL happen. It'll drag the people afraid of change kicking and screaming along with it, just as has happened before in many other technology shifts. Sell your buggy whips already, it's time to move on.

Re:graphics, star trek, and the post-PC era (1)

VocationalZero (1306233) | more than 2 years ago | (#38014132)

The neckbeards created the demand (and supply) for the personal computer in the first place, so I'm sure they can keep it afloat.

Well, no (1)

Sockatume (732728) | more than 2 years ago | (#38012890)

When we have handhelds as powerful as the PS3 (the Vita is getting there), we'll have much more powerful PCs and a new generation of consoles.

Re:Well, no (1)

Locutus (9039) | more than 2 years ago | (#38013112)

The story isn't about the handhelds matching desktops; it's about the handhelds getting some very powerful graphics. Besides, the comparison was with consoles, not desktops. Just because consoles in a few years might be doing holographic displays, it doesn't mean handhelds doing pretty nice 3D graphics on battery power isn't nice too.

LoB

Why is this that shocking? The cell chip is 5 (1)

jeffmeden (135043) | more than 2 years ago | (#38012994)

Five years ago tomorrow the PS3 made its debut; did you think that in the meantime everyone just sat back and basked in the glory of its infinite capabilities? Two years from now (if that pans out) will be 7 years since the commercialization of the Cell chip, so seeing a miniature version that uses dramatically less power is pretty much par for the course. Desktop chips that have similar (or more specific) capabilities are already available in many products. Remember, the first PS3 drew an amazing 200 watts at full load, and within 2 years that was more than cut in half. This is just more progress, and *promised* progress at that. Hey ARM, why not just say you will have a flying car in 2 years?
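
A naive extrapolation of that halving trend (a sketch only; it assumes power keeps halving every 2 years, which no process roadmap guarantees):

```python
# If a 200 W launch PS3 halves its full-load draw every ~2 years...
launch_watts = 200.0
halving_period_years = 2.0

for years in (2, 5, 7):
    watts = launch_watts * 0.5 ** (years / halving_period_years)
    print(f"after {years} years: ~{watts:.0f} W")
# after 2 years: ~100 W; after 5 years: ~35 W; after 7 years: ~18 W
# -- still an order of magnitude above a phone's power budget.
```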

Re:Why is this that shocking? The cell chip is 5 (1)

Locutus (9039) | more than 2 years ago | (#38013168)

Good points, but we are also talking about things in the single digits of watts for power consumption. I agree, die shrinkage and advances in designs give lots of power savings. Still, having PS3-like graphics on a handheld will be nice.

LoB

Re:Why is this that shocking? The cell chip is 5 (1)

heinousjay (683506) | more than 2 years ago | (#38013316)

Where in the story was it promised this was shocking? We all get that you're a supergenius who could run the entire technology sector single-handed and better than it's done now, but those of us who understand that the Cell chip isn't even the point here might be interested in what's coming next, you know?

Is it a double precision SIMD FPU? (0)

Anonymous Coward | more than 2 years ago | (#38013190)

Is it a double precision SIMD FPU?

Screen Size will make the deal work. (1)

inhuman_4 (1294516) | more than 2 years ago | (#38013736)

I think this will make a huge difference in mobile gaming because of screen size. Assuming that this thing outputs 720p like the Galaxy Nexus, it will be a big thing.

While PS3 graphics are old and crappy compared to what a modern PC can do, don't forget about screen size. Seeing 720p on a 40-inch screen is a lot different from seeing 720p on a 5-inch screen. The best example of this is fonts: ones that look fine at 5 inches will look like crap expanded to 40 inches. Artifacts and jaggedness visible on a 40-inch screen are going to be pretty minimal on a 5-inch one. We are talking about shrinking by almost a factor of 10. At some point the quality of the output will exceed our eyes' ability to notice the difference.
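
The pixel-density arithmetic behind that, as a quick sketch (assuming 16:9 panels and the stated diagonals):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return hypot(width_px, height_px) / diagonal_in

print(f'720p on a  5" phone: {ppi(1280, 720, 5):.0f} PPI')   # ~294 PPI
print(f'720p on a 40" TV:    {ppi(1280, 720, 40):.0f} PPI')  # ~37 PPI
```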

Of course this will do nothing to improve what the chip can render in terms of complex environments, smoke etc. But at 5 inches it is not hard to have too much on the screen.
