
NVIDIA Challenges Apple's iPad Benchmarks

Soulskill posted more than 2 years ago | from the insert-poker-analogy dept.

Graphics

MojoKid writes "At the iPad unveiling last week, Apple flashed up a slide claiming that the iPad 2 was 2x as fast as NVIDIA's Tegra 3, and that the new iPad would be 4x more powerful than Team Green's best tablet. NVIDIA's response boils down to: 'it's flattering to be compared to you, but how about a little data on which tests you ran and how you crunched the numbers?' NVIDIA is right to call Apple out on the meaningless nature of such a comparison, and the company is likely feeling a bit dogpiled given that TI was waving unverified webpage benchmarks around less than two weeks ago. That said, the Imagination Technologies (PowerVR) GPUs built into the iPad 2 and the new iPad both use tile-based rendering. In some ways, 2012 is a repeat of 2001: memory bandwidth is at an absolute premium because adding more bandwidth has a direct impact on power consumption. The GPU inside NVIDIA's Tegra 2 and Tegra 3 is a traditional design, which means it's subject to significant overdraw, especially at higher resolutions. Apple's comparisons may be bogus, but the Tegra 3 bandwidth issues they indirectly point to aren't. It will be interesting to see NVIDIA's next move and what their rumored Tegra 3+ chip might bring."
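To put rough numbers on the bandwidth point, here is a back-of-the-envelope sketch in Python (the 60fps target and the 2.5x overdraw factor are illustrative assumptions, not measured figures):

width, height = 2048, 1536              # new iPad panel
bytes_per_pixel = 4                     # 32-bit RGBA color
fps = 60                                # assumed refresh target
overdraw = 2.5                          # assumed average number of times an
                                        # immediate-mode renderer shades each pixel

one_pass = width * height * bytes_per_pixel * fps   # one color write per pixel
with_overdraw = one_pass * overdraw                 # occluded pixels written anyway

print(f"one pass per frame: {one_pass / 1e9:.2f} GB/s of color writes")
print(f"with overdraw:      {with_overdraw / 1e9:.2f} GB/s")
# A tile-based renderer resolves visibility in small on-chip tiles, so ideally
# only about one color write per pixel ever reaches external memory.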


This is funny. (5, Funny)

imagined.by (2589739) | more than 2 years ago | (#39318537)

The irony in this is that this is coming from a company that presented chunks of wood as their next-gen graphics cards.

Re:This is funny. (4, Insightful)

arbiter1 (1204146) | more than 2 years ago | (#39318617)

Yeah, it's also ironic that the company claiming its chip is 4x faster was known to cheat on benchmarks years ago to make its systems look faster than they were.

Re:This is funny. (5, Insightful)

DJRumpy (1345787) | more than 2 years ago | (#39318755)

One also has to consider that the older iPad 2 smeared the floor with the Tegra 3, so why would they think that twice the performance is 'meaningless'? Considering Apple typically doesn't play too loose with the marketing statistics for metrics like battery life, real world performance, etc., I don't find this to be a stretch. It will be interesting to see the real world benchmarks when the hardware arrives.

http://hothardware.com/Reviews/Asus-Eee-Pad-Transformer-Prime-Preview/?page=7 [hothardware.com]

Re:This is funny. (5, Interesting)

Anonymous Coward | more than 2 years ago | (#39318851)

Smeared the floor with Tegra 3? I'm sorry, but meaningless benchmarks are meaningless. I hold both Tegra 2 (Ice Cream Sandwich) and iPad 2 devices in my hand at this very minute, and I can tell you that there is essentially no noticeable difference between the two in terms of responsiveness or 3D performance from the point of view of the end user (and that's despite the iPad 2 having a significantly lower-resolution screen than the Tegra 2 device. The latter has 30% more pixels than the iPad 2 does.)

For the iPad 2 to "wipe the floor" with Tegra 3, it would have to be significantly slower than Tegra 2, and it isn't. Hence, these benchmarks can be nothing other than complete nonsense.

Re:This is funny. (2)

jo_ham (604554) | more than 2 years ago | (#39318925)

But your highly scientific benchmark is?

Re:This is funny. (4)

poly_pusher (1004145) | more than 2 years ago | (#39319009)

Since you have both, could you run the OpenGL Egypt benchmark for comparison?

Re:This is funny. (1)

MobileTatsu-NJG (946591) | more than 2 years ago | (#39319067)

Ummm... Are you testing this by playing the same game on both?

Re:This is funny. (1)

Anonymous Coward | more than 2 years ago | (#39319161)

How well does Infinity Blade play on Android compared to iOS?

Re:This is funny. (0)

Anonymous Coward | more than 2 years ago | (#39319265)

Tegra 2 or 3 liar?

Re:This is funny. (0)

Anonymous Coward | more than 2 years ago | (#39319841)

Yet Tegra 2 is the only SoC that doesn't support NEON, making it pretty much useless. Typical proprietary NVidia shit. Oh, and zero documentation available, as usual.

Re:This is funny. (3, Insightful)

cheesybagel (670288) | more than 2 years ago | (#39318869)

Apple doesn't play too loose with marketing statistics? You're simply forgetting the late PowerPC times, when a water-cooled Apple system was slower than an air-cooled Intel PC.

Re:This is funny. (1)

DJRumpy (1345787) | more than 2 years ago | (#39318891)

Is there some commercial or ad you are referring to?

Re:This is funny. (3, Informative)

Narcocide (102829) | more than 2 years ago | (#39318919)

No, he's referring to a conspicuous weakness of the final lines of (quad-core, btw) G5 macs compared to the company's own first competing Intel offerings. Another not-so-well-known weakness is that they also drew more juice under load than most full-sized refrigerators.

Re:This is funny. (2)

Anonymous Coward | more than 2 years ago | (#39318947)

I believe the parent is pointing to the fact that Apple doesn't appear to have put out any misleading marketing materials claiming that PPC was dominating Intel's chipsets on the G5. Is there some marketing benchmark out there that Apple lied about?

you're an idiot (0)

Anonymous Coward | more than 2 years ago | (#39319679)

Apple said its Mac Pro was 2.5x faster than its last G5. Let's do the math: 4 x 2.5GHz (last G5) = 10 total GHz; the next Mac Pro was 8 x 3.0GHz = 24GHz. Funny, almost linear scaling with clock speed. Yeah, Intel was so much faster. They switched because of portables; you couldn't jam a water-cooled G5 into a portable.

Re:This is funny. (1)

milkmage (795746) | more than 2 years ago | (#39318941)

Not sure about the water-cooled system comment, but they did add "4G" to the iPhone 4S with the OS patch... everyone knows the software didn't upgrade the hardware to LTE.

Re:This is funny. (2)

jo_ham (604554) | more than 2 years ago | (#39319061)

I guess it depends on what the carriers are calling "4G". I assume the menu displays whatever the carrier has termed 4G, since the 4S supports most of those "3.5G" technologies that have been rebranded as 4G.

Although sometimes software upgrades can upgrade hardware: remember the $1.99 charge for the 802.11n enabler patch on some early systems that shipped with draft-n hardware but no software support? (Yes, yes, I know that's not what has happened with the 4S.)

Re:This is funny. (1)

milkmage (795746) | more than 2 years ago | (#39319187)

My point is they changed it with a software update, whereas the non-4G but marketed-as-4G Android phones have always been that way.

Sketchy marketing if you ask me.

(Whether or not AT&T was behind the decision doesn't matter, because Apple included it in THEIR software update.)

Re:This is funny. (2)

Relayman (1068986) | more than 2 years ago | (#39319315)

AT&T convinced Apple that HSPA+ was 4G. That's all.

Re:This is funny. (2)

Belial6 (794905) | more than 2 years ago | (#39319793)

Did the battery die when they did that? Because the Apple fanboys were insistent that 4G drained batteries faster, and that was why Apple didn't support it.

Re:This is funny. (0)

Anonymous Coward | more than 2 years ago | (#39319913)

Maybe the battery life difference is actually related to the technology used (LTE as opposed to HSPA+) and not the data transfer speed, and the Apple fanboys were complaining that LTE kills battery life [zdnet.com] ?

Re:This is funny. (1)

arbiter1 (1204146) | more than 2 years ago | (#39318917)

Apple also, in those days, would use a benchmark utility that was optimized for their chip, while the Intel/AMD-based computers they competed against would just run some program downloaded from the web with almost zero optimization for the CPU it ran on.

486 code on a Pentium, not water cooled (1)

perpenso (1613749) | more than 2 years ago | (#39319027)

Apple doesn't play too loose with marketing statistics? You're simply forgetting the late PowerPC times, when a water-cooled Apple system was slower than an air-cooled Intel PC.

That is a bogus point. Those water-cooled G5s were the standard shipping system. It's entirely fair to compare a stock Mac against a stock PC.

The real "engineering" of the PPC vs x86 comparison was through the benchmarking utility. IIRC Apple used a very old version of ByteMarks that was compiled/optimized for the 486 even though they were running on a Pentium at the time. When ByteMarks was recompiled to optimize for the Pentium the PPC advantage faded.

Re:This is funny. (0)

Anonymous Coward | more than 2 years ago | (#39319081)

Apple? Aren't they the scumbags who had ads pulled by the Advertising Standards Authority for being liars?

Re:This is funny. (4, Funny)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39318653)

The irony in this is that this is coming from a company that presented chunks of wood as their next-gen graphics cards.

Hey! Some of us care about 'Green Computing' here, you earth-raping performance whore!

Re:This is funny. (1)

Surt (22457) | more than 2 years ago | (#39318797)

Sure, wooden graphics cards don't use as much electricity, but they have to cut down trees to make them. All in all, the wooden graphic cards are actually worse for the environment.

Re:This is funny. (2)

K. S. Kyosuke (729550) | more than 2 years ago | (#39318883)

wooden graphic cards

We call them "drawing boards" where I live.

Re:This is funny. (1)

Cute Fuzzy Bunny (2234232) | more than 2 years ago | (#39319181)

Yes, but you can huck them into a wood chipper or the recycle bin when they're old.

Re:This is funny. (1)

Surt (22457) | more than 2 years ago | (#39319581)

That just releases the carbon faster!

Re:This is funny. (1)

Cute Fuzzy Bunny (2234232) | more than 2 years ago | (#39319649)

Apparently you don't have a downdraft fan on your recycle bin.

Re:This is funny. (1)

TeknoHog (164938) | more than 2 years ago | (#39318879)

Hey! Some of us care about 'Green Computing' here, you earth-raping performance whore!

Which is why I only use Radeon HD 5xxx [wikipedia.org] cards.

Re:This is funny. (4, Informative)

mTor (18585) | more than 2 years ago | (#39319075)

The irony in this is that this is coming from a company that presented chunks of wood as their next-gen graphics cards.

I had no idea what you were talking about but a quick search showed this:

http://semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/ [semiaccurate.com]

LOL... Nvidia faked a graphics board with a piece of PCB-looking plastic/wood that was screwed to the side of a PC with common hardware-store grade wood screws.

Re:This is funny. (2, Insightful)

hairyfeet (841228) | more than 2 years ago | (#39319875)

Oh, and don't forget the classic "Quack.exe"; I LOLed quite heartily at that one. But seriously, who gives a shit? Let's be honest folks, Apple could crap in a bag and it'd sell millions; it's one of those brands, like Nike and Prada, that are simply gonna sell to the faithful no matter what. Does anyone seriously believe the average Apple customer is gonna walk away from an iPad for an Android ANYTHING just because of specs? Let's face it, NVIDIA could invent the holy grail of mobile chips and it wouldn't make a dent in iPad sales, as it's got the network effect x10. When people see celebs like Stephen Colbert holding up their iPads going "look at what I got, check it out!", it DOES affect their buying other brands. Apple has spent years cultivating the cool hipster angle, and those that have bought into the brand simply aren't gonna settle for some Samsung because it's got an NVIDIA chip.

Before the Appleites start spewing their bile: I'm not saying it's a bad brand. Personally I think it's overpriced and the mobile phone subsidies really saved their sales, but it's not a bad device. It's just that the majority of the people buying these things couldn't name the specs or features if you put a gun to their head; all they know is it's Apple, and Apple is cool.

Numbers are meaningless (5, Insightful)

blahbooboo (839709) | more than 2 years ago | (#39318543)

Is the tablet smooth and instantly responsive in use? At the end of the day, that's all that matters. Tegra 100 or iPad 100 won't matter if the OS that uses it isn't smooth and doesn't keep up with user interactions. Consumers just care about experience; how they get there isn't of interest to anyone other than nerds.

Re:Numbers are meaningless (2)

bemymonkey (1244086) | more than 2 years ago | (#39318575)

What about iOS/Android gamers? Some of those games are pretty taxing and require pretty heavy-duty GPUs to run smoothly...

Re:Numbers are meaningless (4, Interesting)

UnknowingFool (672806) | more than 2 years ago | (#39318675)

From what we know, the A5X is pretty much the same as the A5 except that it uses 4 PowerVR SGX543 cores instead of 2. This 4-core GPU configuration is the same as the PS Vita's, albeit the Vita uses a 4-core ARM CPU and drives a smaller 960 × 544 qHD screen. On the hardware, the Vita should beat the iPad for graphically intense games; for Angry Birds, it may not make much of a difference. At present, we don't know if Apple tweaked the A5X in other ways to boost game performance.

Re:Numbers are meaningless (4, Informative)

samkass (174571) | more than 2 years ago | (#39318725)

From what we know, the A5X is pretty much the same as the A5 except that it uses 4 PowerVR SGX543 cores instead of 2. This 4-core GPU configuration is the same as the PS Vita's, albeit the Vita uses a 4-core ARM CPU and drives a smaller 960 × 544 qHD screen. On the hardware, the Vita should beat the iPad for graphically intense games; for Angry Birds, it may not make much of a difference. At present, we don't know if Apple tweaked the A5X in other ways to boost game performance.

The "New iPad" also has twice as much RAM as a Vita (1GB vs 512MB), which could make a significant difference to practical gaming capability. As you note, as well, we have no idea what else Apple tweaked in the chip. Combined with the difficulty in an apples-to-apples comparison between two very different devices, it'll be hard to ever know how different the raw specs are. I think it's reasonable to say, though, that the "New iPad" will be excellent for gaming, as will a Vita.

Re:Numbers are meaningless (1)

Anonymous Coward | more than 2 years ago | (#39318911)

RAM doesn't matter. At least that's what they said last year, when their iPad and iPhone had only 512MB while the competition had 1GB.
Also, CPU speed does not matter. It's all about the GPU. Except for the iPhone 4, which had a crappy GPU. So during the iPhone 4's lifespan, the GPU didn't matter; it was all about the number of pixels. Sorry, I meant the pixel density.

Re:Numbers are meaningless (4, Interesting)

Narishma (822073) | more than 2 years ago | (#39318913)

The Vita also has 128MB of dedicated VRAM which the iPad (or any other smartphone or tablet for that matter that I'm aware of) doesn't, making things even more difficult to compare.

Factor in the display changes as well (1)

SmallFurryCreature (593017) | more than 2 years ago | (#39319327)

The Vita has a far smaller screen with a fraction of the pixels, that skews it even further. Then again, the Vita has to process more inputs.

Re:Numbers are meaningless (-1)

Anonymous Coward | more than 2 years ago | (#39318757)

True, however no one bought a Vita.

Re:Numbers are meaningless (4, Interesting)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39318681)

Given Apple's (relative) hardware homogeneity (certainly more than Android, though the steadily accumulating pile of older iDevices is inevitable and not going away just yet), I assume that iOS games will largely tax the GPU as hard as possible but not try overshooting (just as console games generally push right to the edge, since the edge is a known quantity). It will be interesting to see whether the new 'retina display' iPads end up seeing titles that sacrifice complexity in other areas to push native resolution, or whether we'll see a lot of 'well, it's smoothly upsampled, but fundamentally the same resolution as the iPad N-1' stuff...

One thing that I don't think has come up yet; but would be interesting to see, is whether Nvidia tries to turn their disadvantage into a bonus by doing more aggressive power scaling...

If, as TFA suggests, Tegra parts are held back by memory bandwidth because faster buses are power hungry, then NVIDIA might be able to substantially speed-bump their parts when the device is on AC power or otherwise not power constrained. So long as the switchover is handled reasonably elegantly, that could turn out to be an advantage in the various HDMI dock/computer replacement/etc. scenarios...

Re:Numbers are meaningless (1)

negRo_slim (636783) | more than 2 years ago | (#39318711)

How about the much-derided Flash? I have a Sony Tablet S [wikipedia.org] which sports a Tegra 2, and it rarely gets used for gaming aside from the kid playing Angry Birds. But it does get used for YouTube and Comedy Central programming, a lot. However, it stutters on most video playback when I visit the non-mobile, non-app YouTube site, or if, heaven forbid, I haven't scrolled precisely to where I can see only the video on The Daily Show/Colbert Report and none of the ads at the bottom.

I'm still inclined to believe it's poor software rather than poor hardware... but c'mon this really doesn't help to inspire confidence in the Tegra platform for me.

Re:Numbers are meaningless (1)

bemymonkey (1244086) | more than 2 years ago | (#39318973)

Interesting, even the first generation Snapdragon in my old Desire renders Flash Video smoothly... Sounds like Tegra might have a few driver issues...?

Re:Numbers are meaningless (0)

Anonymous Coward | more than 2 years ago | (#39318979)

You do realize this is a single- or dual-core 1GHz chip; heavy-duty multitasking isn't going to be its forte compared to a dual- or quad-core 2.4GHz desktop. With Flash, it must go through a browser plugin (which isn't the most efficient, but a necessary compromise for bringing us rich content for the past 10+ years, and the next few years). More than likely, there's a lot of heavy Flash content on the page.

I have a Nexus One which is barely able to play the web version/demo of Plants vs. Zombies. If I try to play it without isolating just the game (last I checked, there were a couple of ads above and below), it becomes quite a bit laggier.

Try long-pressing the Flash window and maximizing it. It might help (that's not what I did above; I actually looked at the web page for the SWF file and went to that URL directly, without maximizing it).

Re:Numbers are meaningless (1)

marsu_k (701360) | more than 2 years ago | (#39319219)

That's odd; I regularly watch the Flash-based versions of The Daily Show and Colbert on my Transformer (the original one, so Tegra 2) and it seems to play them just fine, fullscreen too. Then again, the version of Android that Asus ships is pretty much plain vanilla; I don't know how many "enhancements" Sony has added to theirs.

Re:Numbers are meaningless (1)

locopuyo (1433631) | more than 2 years ago | (#39319603)

Which browser are you using? It probably is a browser performance issue.

Re:Numbers are meaningless (2)

pankkake (877909) | more than 2 years ago | (#39318637)

So why did Apple show their benchmark?

Re:Numbers are meaningless (0)

blahbooboo (839709) | more than 2 years ago | (#39318749)

For people like you who care about such things, the nerds.

Every other person I know who's interested in the iPad 3 has no idea about anything other than that it's new :) . I had to explain to a few of the cheaper people that the newer one is worth the $100 extra over the old model, since they intend to read and the new model has a "prettier/better quality screen for reading."

Re:Numbers are meaningless (1)

Surt (22457) | more than 2 years ago | (#39318819)

I'm doubtful that the $100 is worthwhile for anyone who intends primarily to use the iPad as a reader. The screen on the iPad 2 is 'good enough' for that.

Re:Numbers are meaningless (2)

jo_ham (604554) | more than 2 years ago | (#39318957)

If the new iPad's screen compared to the iPad 2 is the same delta as the 3GS > 4 switch for the iPhone, only at 9.7" then it absolutely is worth the extra $100 if you intend to do a lot of reading on it.

The high dpi on the iPhone 4+ screen is extremely good for reading text, more than almost any other benefit (I assume that HD movies will also be a big thing on the iPad, unlike the iPhone).

Re:Numbers are meaningless (0)

Anonymous Coward | more than 2 years ago | (#39319639)

The high dpi on the iPhone 4+ screen is extremely good for reading text, more than almost any other benefit (I assume that HD movies will also be a big thing on the iPad, unlike the iPhone).

You fell for Apple's marketing. It's not high DPI that makes text readable. It's high resolution. Text is more readable on a 1080p TV than on an iPhone 4, because there are more pixels. More pixels means more text can be shown clearly at once.

Re:Numbers are meaningless (2)

jo_ham (604554) | more than 2 years ago | (#39319831)

You are looking for ways to make Apple's marketing look bad, but failing.

High dpi at a small physical size already means high resolution, but I didn't think I'd have to specify that we're not reading the text on one of those jumbo screens (where the same resolution as an iPad would result in a low dpi).

The high dpi of the iPhone 4 screen (compared to the 3GS) is what makes the text readable. Now, you achieve that on a screen of the same physical dimensions by increasing the resolution of the panel, but in terms of how you discuss what has been done (higher dpi vs higher resolution in the same physical dimensions) you are talking about the same thing.

In other words, higher resolution in the same physical size leads to higher dpi. How is this "falling for marketing"?

(We're also assuming vector typography here - I assume that can be taken as read and not explicitly stated, lest you again claim that I'm "falling for marketing")
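For anyone who wants the arithmetic, here is a minimal sketch (the panel specs are public; diagonal-based ppi is the usual approximation):

from math import hypot

def ppi(w_px, h_px, diagonal_inches):
    # pixels along the diagonal divided by the diagonal's physical length
    return hypot(w_px, h_px) / diagonal_inches

print(f'iPhone 3GS (480x320 @ 3.5"):  {ppi(480, 320, 3.5):.0f} ppi')
print(f'iPhone 4   (960x640 @ 3.5"):  {ppi(960, 640, 3.5):.0f} ppi')
print(f'1080p TV   (1920x1080 @ 40"): {ppi(1920, 1080, 40):.0f} ppi')
# Same trend as above: double the resolution in the same physical size and the
# ppi doubles; spread far more pixels over a huge panel and the ppi craters.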

Re:Numbers are meaningless (1)

MobileTatsu-NJG (946591) | more than 2 years ago | (#39319147)

No, it isn't.

Re:Numbers are meaningless (1)

Surt (22457) | more than 2 years ago | (#39319565)

Seems good enough to me.

Re:Numbers are meaningless (1)

peppepz (1311345) | more than 2 years ago | (#39318825)

Nerds only care about real benchmarks, not marketing speech.

Apple are just keeping their tradition of spectacularizing their keynotes (and their marketing material in general) with abundant use of superlatives and fancy names. Which impress ordinary people much more than nerds (cf. CmdrTaco's reaction to the iPod).

Re:Numbers are meaningless, unless you lie (2)

Overzeetop (214511) | more than 2 years ago | (#39318683)

I like my iPad 1, though it's sluggish. I am (too) anxiously awaiting two new iPads due this Friday. I even kept the running commentary on the announcement up in a browser window (yes, I felt a bit dirty afterward). When I heard the proclamation of the speed difference, that certainly seemed to imply quad-core processing. At least, that was in the realm of possibility (4 CPU cores and 4 GPU cores vs the Tegra). I'm not convinced now that the claim is valid except under very special conditions with a host of caveats (using 2 CPU + 4 GPU cores to calculate GPU-assisted functions vs the 4-core Tegra CPU alone).

I agree 100% with your sentiment, and the responsiveness of the UI makes up for a lot of computational shortcomings in iOS devices. In fact, because the devices aren't meant for computationally intensive processes (protein folding, CFD/FEM analysis, bulk media recoding, etc.), the speed of the processor only needs to be fast enough not to hinder the user's flow. Almost all of the media processing is so limited in format on iOS devices that encode/decode can be HW accelerated, precluding the need to do the killer ops in software. So it may not matter how fast the A5X is, as long as it is "fast enough." Anything faster than real time won't matter to the user as long as it's real time ALL the time. But you can't just go make up numbers.

Re:Numbers are meaningless (-1)

Anonymous Coward | more than 2 years ago | (#39318741)

Nvidia is a bunch of proprietary assholes and I'm never buying anything made by them.

Re:Numbers are meaningless (0)

Anonymous Coward | more than 2 years ago | (#39318837)

Yeah, unlike Apple, which has no closed-down, proprietary OS running on their iPad. They also do no vendor locking at all.
Screw Nvidia, I am buying an iPad for its openness.

Re:Numbers are meaningless (1)

TheRaven64 (641858) | more than 2 years ago | (#39319321)

Different markets. Apple sells locked down devices to consumers. nVidia doesn't even provide their OEM customers with the specs required to write drivers. They either use an nVidia-blessed driver, or none at all. If nVidia decides to stop supporting your tablet, then you can't even complain to the manufacturer about the lack of driver updates (which, given the number of security holes in nVidia drivers in the past, can be important), because they can't do anything about it.

Re:Numbers are meaningless (1)

ReeceTarbert (893612) | more than 2 years ago | (#39318897)

Consumers just care about experience; how they get there isn't of interest to anyone other than nerds.

True, but then why is Apple boasting about 2x, 4x, whatever?

RT.

It's not like tile-based is magic (0)

Anonymous Coward | more than 2 years ago | (#39318583)

It's not like there aren't trade-offs and downsides to using tile-based. In the end, tile-based GPUs will be a footnote in history.

Re:It's not like tile-based is magic (2)

MrLizardo (264289) | more than 2 years ago | (#39320027)

That's also what they said in the late 90's when the PowerVR was competing with the 3Dfx Voodoo add-in cards. Given that there have been at least 50 million PowerVR-based GPUs shipped so far that's a heck of a footnote.

Real data (0)

Anonymous Coward | more than 2 years ago | (#39318591)

Want a look at what the A5X can do? Look at some PSVita games. Same GPU. You can even render at a lower resolution like 1024x768 and put that on the screen full-size.

We have no data to show that Apple didn't further bump up the memory bus size (they doubled it from A4 to A5).

PowerVR, eh? (3, Interesting)

msobkow (48369) | more than 2 years ago | (#39318623)

I didn't know the PowerVR chips were still around. I had one of the early video cards based on the technology for my PC years ago. It worked ok, but that was long before things like shaders were an issue.

Still, we are talking about a portable device, so I'd think battery life would be more important than having the latest whizz-bang shaders. And just look at all the grief people are having with the Android lineup due to shader differences between vendors.

Thank God I focus on business programming, not video games. I've yet to hear of ANY tablet or smartphone having problems displaying graphs and charts.

Re:PowerVR, eh? (3, Informative)

JimCanuck (2474366) | more than 2 years ago | (#39318691)

PowerVR GPUs are integrated into a lot of the ARM processors used by mobile companies. It's not a secret, but only Apple-related articles like to poke fun at it. PowerVR went from being a "brand name" to being the developer behind a lot of the graphics in everything from PCs to game consoles to HDTVs to cell phones.

For that matter, Samsung had been integrating them before the iPhone in any flavor came out, and continues to do so.

Re:PowerVR, eh? (2)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39318693)

PowerVR hasn't shown their face on the PC side in years; but they are something of an 800lb gorilla in power-constrained GPU environments. Not the only player; but a lot of ARM SoCs of various flavors include them. Intel even enlisted them, rather than its in-house designs or the traditional PC guys, for a number of its very-low-power Atom parts...

Re:PowerVR, eh? (1)

UnknowingFool (672806) | more than 2 years ago | (#39318697)

PowerVR left the PC graphics business a long time ago; they have since focused mainly on mobile devices. Like ARM, PowerVR does not make products but licenses their designs to others. The PS Vita uses the same graphics setup as the new iPad: 4 SGX543 cores. TI has used PowerVR in the last several generations of OMAP products.

Re:PowerVR, eh? (2)

TheRaven64 (641858) | more than 2 years ago | (#39318707)

I didn't know the PowerVR chips were still around.

PowerVR has been doing very well in mobile devices for years. ARM's Mali is just starting to take away some market share from them, but before that most ARM SoCs came with a PowerVR GPU. Like ARM, they license the designs to anyone willing to pay, so if you wanted to make a SoC, a PowerVR GPU was an obvious choice: nVidia only uses the Tegra GPUs in their own chips; they don't license them. Now it's a less obvious choice, because you can license both CPU and GPU designs from ARM, and the latest Mali stuff is pretty nice: full OpenCL support (not just the mobile profile).

Thank God I focus on business programming, not video games. I've yet to hear of ANY tablet or smartphone having problems displaying graphs and charts.

What about 3D transition effects for presentation software? 3D data visualisation? Augmented reality?

Last Tegra device I'll ever buy (2, Informative)

Anonymous Coward | more than 2 years ago | (#39318631)

Bought a Galaxy Tab for the Tegra 2, was so utterly disappointed. The real world performance was atrocious even compared to devices it was officially benchmarked better against. Sold it within 3 months. Still waiting on a great Android tablet....

Re:Last Tegra device I'll ever buy (5, Funny)

MobileTatsu-NJG (946591) | more than 2 years ago | (#39318807)

Well don't you worry, last week Apple announced Samsung's next tablet!

Re:Last Tegra device I'll ever buy (1)

arbiter1 (1204146) | more than 2 years ago | (#39318939)

Yeah, Tegra 2 was not that good. The Boxee Box was planned around it, but they scrapped it because it lacked the power to play 1080p High Profile video.

Don't worry, Nvidia! (5, Insightful)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39318633)

Just ask Intel about Apple's benchmarking strategy: For years, the finest in graphic design publicly asserted that PPC was so bitchin' that it was pretty much just letting Intel and x86 live because killing your inferiors is bad taste. Then, one design win, and x86 is suddenly eleventy-billion percent faster than that old-and-busted PPC legacy crap.

Or ask Amazon: Amazon releases 'Kindle' e-reader device. His Steveness declares "Nobody reads". And now Apple is pushing books, newspapers, and their own pet proprietary publishing platform...

Cheer up, emo Nvidia, all you have to do is sell Apple a Tegra N SoC, or even just the rights to include your GPU in their AN SoC, and Tim Cook will personally explain to the world that PowerVR GPUs are slow, weak, make you 30% less creative and are produced entirely from conflict minerals.

Re:Don't worry, Nvidia! (0)

Anonymous Coward | more than 2 years ago | (#39318699)

[Citation Needed]

Re:Don't worry, Nvidia! (2)

Nerdfest (867930) | more than 2 years ago | (#39318859)

Here [apple.com]

Re:Don't worry, Nvidia! (5, Funny)

MisterMidi (1119653) | more than 2 years ago | (#39318881)

I don't know why you need a citation of what fuzzyfuzzyfungus wrote, but here you go:

Just ask Intel about Apple's benchmarking strategy: For years, the finest in graphic design publicly asserted that PPC was so bitchin' that it was pretty much just letting Intel and x86 live because killing your inferiors is bad taste. Then, one design win, and x86 is suddenly eleventy-billion percent faster than that old-and-busted PPC legacy crap.

Or ask Amazon: Amazon releases 'Kindle' e-reader device. His Steveness declares "Nobody reads". And now Apple is pushing books, newspapers, and their own pet proprietary publishing platform...

Cheer up, emo Nvidia, all you have to do is sell Apple a Tegra N SoC, or even just the rights to include your GPU in their AN SoC, and Tim Cook will personally explain to the world that PowerVR GPUs are slow, weak, make you 30% less creative and are produced entirely from conflict minerals.

Re:Don't worry, Nvidia! (5, Informative)

TheRaven64 (641858) | more than 2 years ago | (#39318745)

Just ask Intel about Apple's benchmarking strategy: For years, the finest in graphic design publicly asserted that PPC was so bitchin' that it was pretty much just letting Intel and x86 live because killing your inferiors is bad taste. Then, one design win, and x86 is suddenly eleventy-billion percent faster than that old-and-busted PPC legacy crap.

This wasn't totally misleading. The G4 was slightly faster than equivalent Intel chips when it was launched and AltiVec was a lot better than SSE for a lot of things. More importantly, AltiVec was actually used, while a lot of x86 code was still compiled using scalar x87 floating point stuff. Things like video editing - which Apple benchmarked - were a lot faster on PowerPC because of this. It didn't matter that hand-optimised code for x86 could often beat hand-optimised code for PowerPC, it mattered that code people were actually running was faster on PowerPC. After about 800MHz, the G4 didn't get much by way of improvements and the G5, while a nice chip, was expensive and used too much power for portables. The Pentium M was starting to push ahead of the PowerPC chips Apple was using in portables (which got a tiny speed bump but nothing amazing) and the Core widened the gap. By the Core 2, the gap was huge.

It wasn't just one design win, it was that the PowerPC chips for mobile were designs that competed well with the P2 and P3, but were never improved beyond that. The last few speedbumps were so starved for memory bandwidth that they came with almost no performance increase. Between the P3 and the Core 2, Intel had two complete microarchitecture designs and one partial redesign. Freescale had none and IBM wasn't interested in chips for laptops.

Re:Don't worry, Nvidia! (1)

Anonymous Coward | more than 2 years ago | (#39318811)

SPEC benchmarks never supported any of Apple's claims, in either integer or floating-point performance.

Re:Don't worry, Nvidia! (1)

TheRaven64 (641858) | more than 2 years ago | (#39318865)

SPEC benchmarks are irrelevant to most users (they're actually irrelevant to most HPC users too - they're for dick waving, not for anything else). Important benchmarks are things like comparing complex Adobe Photoshop filter application times, because that's what translates to real money for users. Time spent waiting for the computer to do things is time spent not getting anything done that you get paid for. And these benchmarks were verified by other people.

As I said in my post, a lot of the difference was due to the fact that Apple made it easy for people to use AltiVec on MacOS (including providing things like the Accelerate framework, that implemented a lot of common signal processing algorithms for you), while using SSE on Windows was a lot harder and rarer. AltiVec code was at least a factor of 4 faster than x87 code, often more.
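The factor of 4 falls straight out of the vector width; here is a minimal sketch, assuming single-precision floats (it counts in-register ops only, ignoring memory bandwidth and whether a given loop vectorises at all):

VECTOR_BITS, FLOAT_BITS = 128, 32
lanes = VECTOR_BITS // FLOAT_BITS   # AltiVec: four 32-bit floats per instruction

n = 10000                           # elements in some hot loop
scalar_ops = n                      # x87: one floating-point op per element
vector_ops = -(-n // lanes)         # AltiVec: ceil(n / lanes) vector ops
print(f"scalar: {scalar_ops} ops, vector: {vector_ops} ops, "
      f"{scalar_ops / vector_ops:.1f}x fewer instructions")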

Re:Don't worry, Nvidia! (0)

Anonymous Coward | more than 2 years ago | (#39319387)

Trying to extrapolate CPU speed from a specific application and a specific operation, and calling that more relevant, is pure fanboyism, plain and simple. SPEC benchmarks (while not perfect by any means) give a much better indicator; it's harder to pick and choose. If you knew anything about SPEC benchmarks you wouldn't be making such a silly claim of a single Adobe filter being faster on one platform and trying to make the claim of winning the speed contest.

But then again, that is what Apple has always done: try to find even a single instance of a single operation it's faster in, then declare itself the winner. So at least you are consistent with your own fanboyism.

For those of us that *actually* care about real performance, we will stick with a much greater mix of benchmarks and real-world application tests.

Re:Don't worry, Nvidia! (2)

TheRaven64 (641858) | more than 2 years ago | (#39319459)

If you knew anything about SPEC benchmarks

I work on a compiler used in HPC. I know a fair amount about SPEC benchmarks and how little they're trusted outside of dick waving lists.

you wouldn't be making such a silly claim of a single Adobe filter being faster on one platform and trying to make the claim of winning the speed contest.

In the compiler world, we say that there is only one benchmark that really matters: your code. Apple's core market at that time was people running the Adobe creative suite. This suite ran faster on Macs than on Windows. Whether that was due to the processor, the compiler, or better code, is irrelevant to the user. The user cares about how much this expensive purchase will allow them to earn.

Re:Don't worry, Nvidia! (1)

Anonymous Coward | more than 2 years ago | (#39319077)

This wasn't totally misleading. The G4 was slightly faster than equivalent Intel chips when it was launched and AltiVec was a lot better than SSE for a lot of things.

Only if you were comparing CPUs running at the same frequency. Intel has had the best manufacturing process for decades, and their offerings had no problem beating the PPC because of Intel's ridiculously high frequencies (remember the Pentium IV and how it felt like an oven?). And the reason Intel did this wasn't even because they cared about Apple -- it was to beat AMD, which also had better performance per clock.

Re:Don't worry, Nvidia! (0)

Anonymous Coward | more than 2 years ago | (#39319445)

No. Just No.

I remember when the front page of Apple's website had the amazing AltiVec benchmarks showcasing the incredible performance of the Mac. Did you read the fine print where Apple DISABLED SSE in every micro benchmark?

It's one thing to screw with your own hardware and benchmarks to increase your product's score (*cough* nVidia). It's quite another to go out of your way to handicap your competitors' products. GP is right: Apple will lie and twist the data to market their products as the best and their competitors' as terrible things that should never be used... until Apple uses them, that is.

cure for the blue face (4, Informative)

epine (68316) | more than 2 years ago | (#39319845)

This wasn't totally misleading. The G4 was slightly faster than equivalent Intel chips when it was launched and AltiVec was a lot better than SSE for a lot of things. More importantly, AltiVec was actually used, while a lot of x86 code was still compiled using scalar x87 floating point stuff.

This was totally misleading, for any informed definition of misleading.

Just as there are embarrassingly parallel algorithms, there are embarrassingly wide instruction mixes. In the P6 architecture there was a three-uop-per-cycle retirement gate, with a fat queue in front. If your instruction mix had any kind of stall (dependency chain, memory access, branch mispredict), retirement usually caught up before the queue filled. In the rare case (Steve Jobs' favorite Photoshop filter) where the instruction mix could sustain a retirement rate of 4 instructions per cycle, x86 showed badly against PPC. Conversely, on bumpy instruction streams full of execution hazards, x86 compared favourably, since it had superior OOO headroom.

CoreDuo rebalanced the architecture primarily by adding a fair amount of micro-op fusing, so that one retirement slot effectively retired two instructions (without increasing the amount of retirement dependency checking in that pipeline stage). In some ways, the maligned x86 architecture starts to shine when your implementation adds the fancy trick of micro-op fusion, since the RMW addressing mode is fused at the instruction level. In RISC these instructions are split up into separate read and write portions. That was an asset at many lithographic nodes. But not at the CoreDuo node, as history recounts. Now x86 has caught up on the retirement side, and PPC is panting for breath on the fetch stream (juggling two instructions where x86 encodes only one).

The multitasking agility of x86 was also heavily and happily used. It happens not to show up in pure Photoshop kernels. Admittedly, SSE was pretty pathetic in the early incarnations. Intel decided to add it to the instruction set, but implemented it double pumped (two dispatch cycles per SSE operation). Of course they knew that future devices would double the dispatch width, so this was a way to crack the chicken and egg problem. Yeah, it was an ugly slow iterative process.

The advantage of PPC was never better than horses for courses, and PPC was picky about the courses. It really liked a groomed track.

x86 hardly gave a damn about a groomed track. It had deep OOO resources all the way through the cache hierarchy to main memory and back. The P6 was the generation where how you handled erratic memory latency mattered more for important workloads (ever heard of a server?) than the political correctness of your instruction encoding.

Apple never faltered in waving around groomed track benchmark numbers as if the average Mac user sat around and ran Photoshop blur filters 24 by 7. That was Apple's idea of a server workload.

mov eax, [esi]    ; load from memory
inc eax           ; increment in a register
mov [esi], eax    ; store back to memory

That's a RISC program in x86 notation. Whether the first and second use of [esi] amounts to the same memory location as any other memory access that OOO might interleave is a big problem. That's a lot of hazard detection to do to maintain four-wide retirement.

Here is a CISC program in x86 notation. I can't show it to you in PPC notation, since PPC is a proper subset minus this feature.

inc [esi]         ; load, add, and store as one read-modify-write instruction

Clearly, with a clever implementation, you can arrange that the hazard check against potentially interleaved accesses to memory is performed once, not twice. It takes a lot of transistors to reach the blissful state of clever implementation. That's precisely the story of CoreDuo. It finally hit the bliss threshold (helped greatly that the Prescott people and their marketing overlords were busy walking the green plank).

Did Apple tell any of this story in vaguely the same way? Nooooo. It waved around one embarrassingly wide instruction stream that appealed to cool people until it turned blue in the face.

Cure for the blue face: make an about face.

Do I trust this new iPad 3 benchmark? Hahahahahaha. You know, I've never let out my inner six year old in 5000 posts, but it feels good.

PPC v Intel x86 - A Mac game dev's perspective (4, Interesting)

perpenso (1613749) | more than 2 years ago | (#39318943)

Having, back in the day, written a fair bit of code that ran on both PPC and x86, including a bit of assembly for both, I'd agree that Apple's comparisons were more a work of marketing than engineering, but PPC legitimately had its moments. Apple used phrases like "up to twice as fast" and there were certainly cases where this was true; however, these tended to be very specialized situations where the underlying algorithm played to the natural strengths of the PPC architecture. Such cases do not represent more general code and common algorithms. In general, my recollection of those days is that PPC had about a 25% performance advantage over x86. However, this advantage was nullified by Intel's ability to reach much higher clock rates.

Overall, as a Mac game developer, it took a bit of effort to get Mac ports on a par with their PC counterparts. One caveat here: emphasis on "port", in that the games tended to have been written with only x86 in mind. Contrary to popular belief, it is entirely possible to write code in high-level languages that favors one architecture over the other, CISC or RISC, etc. So the x86 side may have had an advantage in that the code was naturally written to favor that architecture. A counterpoint, however, is that we did profile extensively and rewrote perfectly working original code where we thought we could leverage the PPC architecture, including dropping down to assembly when compilers could not exploit the architecture properly. Still, this only achieved parity.

Again, note this was back in the day: games that were not using a GPU, so it was more of a CPU vs. CPU comparison.

Re:PPC v Intel x86 - A Mac game dev's perspective (2)

ShooterNeo (555040) | more than 2 years ago | (#39319141)

Out of curiosity, did the Mac sales bring in enough revenue to be worth all the costs of doing the porting?

Re:PPC v Intel x86 - A Mac game dev's perspective (1)

perpenso (1613749) | more than 2 years ago | (#39319705)

Out of curiosity, did the Mac sales bring in enough revenue to be worth all the costs of doing the porting?

I never saw financials, but the publisher continued to support the Mac throughout the PPC era. Now in the x86 era it's a little bit easier to do the port and the Mac market is many times larger. I think doing a Mac version of a game today is much less risky.

The Law of Advertised Benchmarks (0)

Anonymous Coward | more than 2 years ago | (#39318635)

Given any two devices X and Y, X is significantly faster than Y.

This confuses many people because in general usage of the word "faster" two different devices can't both be faster than the other. But it's the accepted industry standard.

Re:The Law of Advertised Benchmarks (1)

Surt (22457) | more than 2 years ago | (#39318855)

It comes down to the laws of advertising, in which 'faster' is legally constrained to 'as fast as'. So both X can be faster than Y, and Y can be faster than X, if they are the same speed.

Except it's not a repeat of the PowerVR case at all (1)

Anonymous Coward | more than 2 years ago | (#39318647)

The last TBDR vs. rasterizer wars were before the rasterizers added aggressive depth compression and hierarchical Z-buffering, which eliminated many of the advantages of the TBDR architecture, especially as triangle rates have risen (higher triangle rates carry additional costs on a TBDR).

If TBDR were always a huge advantage, NVIDIA or ATI would surely have gone that way - why ignore a 'better' technology if it really is better?

It's just 'different': under different scenes the two have somewhat different tradeoffs.
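A toy model of that tradeoff (illustrative Python only): count how many times one pixel gets shaded when several opaque triangles covering it are submitted in random order.

import random

def immediate_mode_shades(depths):
    """Depth-tested immediate mode: every fragment that passes the Z test is shaded."""
    shades, nearest = 0, float("inf")
    for d in depths:       # fragments arrive in submission order
        if d < nearest:    # passes the depth test, so it is shaded now...
            shades += 1    # ...even if a later, nearer fragment occludes it
            nearest = d
    return shades

def tbdr_shades(depths):
    """TBDR: visibility is resolved on-chip per tile first, then one shade per pixel."""
    return 1 if depths else 0

random.seed(1)
trials = [[random.random() for _ in range(5)] for _ in range(100000)]
avg = sum(immediate_mode_shades(t) for t in trials) / len(trials)
print(f"immediate mode: ~{avg:.2f} shades/pixel (5 overlapping triangles, random order)")
print(f"TBDR:            {tbdr_shades(trials[0])} shade/pixel")
# Worst case (back-to-front submission) shades all 5; hierarchical Z and depth
# compression are exactly the tricks that claw back most of that redundant work.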

Tablets not GPU-limited, they're money-limited (3, Interesting)

volcanopele (537152) | more than 2 years ago | (#39318721)

The GPUs in both the iPad (2nd and 3rd gen) and Tegra 3 tablets are more than capable of playing high-quality games. At the very least, direct ports from the last console generation (like GTA III and The Bard's Tale) run just fine on both types of tablet. The problem is not the GPU of either Apple's or Google's tablets. The problem is money: how much are developers willing to spend on producing a game where the max selling price is ~$10 (I've only seen >$15 on the Final Fantasy ports)? This limits the scope of mobile games on either OS to pretty tech demos (like Infinity Blade), games designed for the lowest common device (think Gameloft's games), cheaply designed casual games that don't push the GPU in the slightest (Angry Birds, Jetpack Joyride), or ports of older games (FF Tactics, GTA III, The Bard's Tale).
Don't get me wrong, I love gaming on my iPad (or at least I like it enough to have no desire to get a PS Vita), but there are few games that truly push the GPU because there is just no money in doing so. Until people are willing to pay $30-40 for a top-notch game on their mobile device, we won't see them.

And before someone says that touchscreens are another factor: please, that's only a problem with ports, or with developers who think touchscreen games are just like console or handheld games (*cough* EA Sports *cough*). Fighting games that require you to hit a bunch of virtual buttons are wretched on a touchscreen device; fighting games like Infinity Blade are pretty fun because they take advantage of the touch screen rather than treating it like a virtual controller. I actually did like GTA III, but I often had to find alternative ways to complete missions because running and gunning was more difficult than using the sniper rifle.

Amazing (1)

Wovel (964431) | more than 2 years ago | (#39318759)

Nvidia is stupid enough to take the bait. Good job.

Re:Amazing (0)

Anonymous Coward | more than 2 years ago | (#39318785)

There is no such thing as bad press, so long as it attracts a lot of attention and divides the group into two opposing, passionate sides.

it should be true (1)

Espectr0 (577637) | more than 2 years ago | (#39318781)

The old iPad 2 is faster than the Tegra 3, according to Ars Technica, so it makes sense that the new iPad is even faster. I can't find the link, but I saw it a few days ago, maybe here in a comment.

Re:it should be true (1)

UnknowingFool (672806) | more than 2 years ago | (#39318861)

Well, there was this Slashdot article [slashdot.org], though the summary is misleading in that it claims the Tegra 3 beat the A5; reading the article, it appears that the A5 beat the Tegra 3. For the most part, the two could not be compared, as the Tegra 3 ran Android benchmarks which cannot be applied to Apple and the A5.

Apple's numbers make sense (5, Insightful)

RalphBNumbers (655475) | more than 2 years ago | (#39318887)

We recently saw a graphics benchmark of the A5 vs the Tegra 3 posted to /., and the A5 beat the Tegra in real-world-ish benchmarks and more than doubled its score in fill rate. [hothardware.com]

The A5X is basically just the A5 with twice as many GPU cores, and graphics problems tend to be embarrassingly parallel, so unless it scales up really poorly with those extra cores (due to shared bandwidth limitations, or poor geometry scaling) it should have no problem beating the Tegra 3 by 2x, especially in terms of fill rate.

And when you quadruple the number of pixels on your screen, as Apple just did, which measurement matters? Fill rate.
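The pixel arithmetic behind that (panel resolutions are public specs; everything else follows):

ipad2    = 1024 * 768    #   786,432 pixels
new_ipad = 2048 * 1536   # 3,145,728 pixels
prime    = 1280 * 800    # 1,024,000 pixels (Tegra 3 Transformer Prime, for scale)

print(f"new iPad vs iPad 2:            {new_ipad / ipad2:.0f}x the pixels")
print(f"new iPad vs Transformer Prime: {new_ipad / prime:.2f}x the pixels")
# Pushing the same scene at the same frame rate therefore needs ~4x the fill
# rate of an iPad 2, which is exactly what doubling the GPU cores is aimed at.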

iPad 2 Already Beat Tegra 3 (4, Interesting)

TraumaHound (30184) | more than 2 years ago | (#39318899)

Considering that these graphics benchmarks from Anandtech [anandtech.com] show the iPad 2 GPU handily beating a Tegra 3, it doesn't seem like much of a stretch that the iPad 3 GPU should beat it further.

Re:iPad 2 Already Beat Tegra 3 (0)

Anonymous Coward | more than 2 years ago | (#39319183)

That is using Tegra 3 benchmarks running on Gingerbread; the real problem here is fragmentation.

who the hell cares? (4, Insightful)

milkmage (795746) | more than 2 years ago | (#39319157)

Tegra smegma A5X tri-dual-octo-quad-core ACME RX3200 Rocket Skates GigaHertzMegaPixelPerSecond my asshole: the graphics chip is irrelevant.
The ONLY thing that matters is how it works when it's in your hands.

Does it drive 2048x1536 at least as well as the iPad 2? Yes or no.

The way I see it, neither NVIDIA nor Apple can say anything about relative performance, because there is nothing using Tegra at that resolution. You can benchmark/extrapolate all you want, but all that matters is the real world.

The "quad-core A5X GPU" damn well better be faster, because it's driving 4x as many pixels.

PowerVR has its drawbacks (1)

TrancePhreak (576593) | more than 2 years ago | (#39319355)

I wonder how well they both fare with heavy use of alpha blending. I know this will cause big problems for the tile-based PowerVR chips.