
Nvidia Demos 'Kal-El' Quad-Core Tegra Mobile CPU

timothy posted more than 3 years ago | from the superman-approved dept.

Graphics

MojoKid writes "Nvidia just took the wraps off their first quad-core Tegra mobile processor design at Mobile World Congress today and it's a sight to behold. Dubbed Kal-El, the new chip will be capable of outputting 1440p video content and offer 300 DPI on devices with a 10.1" display. Nvidia is claiming that Kal-El will deliver 5x the performance of Tegra 2 and ship with a 12-core GeForce GPU as well. The company has also posted two different videos of Kal-El in action."


Their other projects are also superheroes (1)

mykos (1627575) | more than 3 years ago | (#35219990)

Slated for the next four years are "Wayne", "Logan", "Powdered Toast Man", and "The Tick".

Re:Their other projects are also superheroes (1)

rsmith-mac (639075) | more than 3 years ago | (#35220186)

In true Tick fashion, even on Slashdot he doesn't get any respect. Here he gets listed after a hero that farts toast and burns the US Constitution for warmth.

Re:Their other projects are also superheroes (0)

Anonymous Coward | more than 3 years ago | (#35220376)

[A] hero that farts toast and burns the US Constitution for warmth.

Oooh! I know! I know! Pick me! ...It's George W Bush, isn't it?!

Re:Their other projects are also superheroes (1)

gl4ss (559668) | more than 3 years ago | (#35220228)

LOG LOG EVERYBODY NEEDS A LOG LOG LOG IT'S BETTER THAN BAD IT'S GOOD. Yes, log. All nations love Log. So, hurry now to your local store and be the first in your country to have the International Log.

But really, what they're promising here is taking the old dual-core chip, doubling everything, and then selling those specs as the real deal.

What happens to power use? Is this faster than building a system with two Tegras?

Re:Their other projects are also superheroes (1)

bhtooefr (649901) | more than 3 years ago | (#35220268)

Can't build a system with two Tegras, as they're not CPUs, they're SoCs - it'd be two systems on the same motherboard. (Which would be interesting for some server applications, but still...)

Re:Their other projects are also superheroes (1)

TheRaven64 (641858) | more than 3 years ago | (#35220944)

You could put a fast interconnect between them and run some SSI cluster OS. As long as you were careful with the scheduling, they'd look like one NUMA machine with 8 cores and two GPUs.

Re:Their other projects are also superheroes (1)

bhtooefr (649901) | more than 3 years ago | (#35221052)

I'm assuming we were talking about Tegra 2s, so it'd look like one NUMA machine with 4 cores and two 4-"core" GPUs, versus the Kal-El chip, which can run a normal OS, has 4 cores, and runs a 12-"core" GPU.

Re:Their other projects are also superheroes (2)

Lord Grey (463613) | more than 3 years ago | (#35220482)

Slated for the next four years are ....

Don't count your weasels before they pop, dink.

-- The Tick

Re:Their other projects are also superheroes (0)

Anonymous Coward | more than 3 years ago | (#35228398)

I'll rely on Arthur to deliver.

Re:Their other projects are also superheroes (1)

nomorecwrd (1193329) | more than 3 years ago | (#35223042)

Let's hope this doesn't become a Blur.

1440p? (1)

Vectormatic (1759674) | more than 3 years ago | (#35219994)

Finally, we get our high-resolution screens?

Too bad it'll probably be at the cost of having to upgrade everything to Blu-ray 2.0 or something...

Re:1440p? (2, Insightful)

chenjeru (916013) | more than 3 years ago | (#35220032)

1440 is a version of 1080p. It still has 1080 lines of vertical resolution, but only 1440 horizontal pixels instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

Re:1440p? (2, Informative)

GerbilSoft (761537) | more than 3 years ago | (#35220062)

1440 is a version of 1080p. It still has 1080 lines of vertical resolution, but only 1440 horizontal pixels instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

This right here is why "HD" is a joke. You've got 1366x768 "720p" displays that are only capable of showing 1280x720 signals, and now there's "1440p" displays that are non-square 1440x1080 instead of the expected 2560x1440. Either that or you're mistaken, since the slides in TFA mention 2560x1600.

Re:1440p? (1)

timeOday (582209) | more than 3 years ago | (#35222032)

Almost anything with a dual-link DVI output can drive a 2560x1600 external display. I was hoping they meant it had accelerated rendering for above-1080p video, which actually would have been cool.

Display resolutions seem to be going down. 1600x1200 laptops were common for a while. Even a year and a half ago I could get a 1080p monitor for under $100, but a couple weeks ago I needed another one, and all the monitors around $100 are only 1600x900, if not some weird resolution slightly less than that.

Re:1440p? (1)

Kjella (173770) | more than 3 years ago | (#35224026)

No. HDV is 1440x1080i at maximum, but 1440p refers to the vertical resolution. 2560x1440 is not a very common resolution, but it's found in monitors like the Dell UltraSharp U2711, Apple LED Cinema Display 27, and NEC SpectraView Reference 271W. You'll have an easier time finding ice cream in the Sahara than native 1440p content, though.

Re:1440p? (1)

Salamander_Pete (1377479) | more than 3 years ago | (#35220078)

1440 is a version of 1080p. It still has 1080 lines of vertical resolution, but only 1440 horizontal pixels instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

No, they mean 2560x1440 with progressive scan.

Re:1440p? (0)

HappyHead (11389) | more than 3 years ago | (#35221308)

No, they mean 2560x1440 with progressive scan.

Well that's a suck-tastic downgrade from their current and past video card lines.

I'm sitting at a workstation with a pair of 2560x1600 resolution monitors right now, on an old Quadro FX 4600 NVidia card that runs them just fine. (Seriously - it's nowhere near their former top of the line, which is in the workstation at the other end of the table from me...) Since their old cards could do better, and they're now bragging about being able to do less, why should we be impressed?

Or is this a sign that the HDTV-induced stagnation in the monitor market is finally going to get broken? The monitors I've got in my lab here at work cost about $3000 when they were new two years ago, but to replace them right now would cost $5000 each because "nobody wants anything but 1920x1080". I hate HDTV with a passion, because it screwed up my monitor purchasing plans. The last monitor I bought for myself, right before HDTV came out and screwed us all, was 1920x1200 - you can't even find a monitor with that high a resolution anymore.

Re:1440p? (0)

Anonymous Coward | more than 3 years ago | (#35221354)

Well that's a suck-tastic downgrade from their current and past video card lines.

This is a mobile CPU/GPU combo intended for cell phones, tablets, etc. It does not (or at least is not intended to) replace the usual discrete desktop graphics cards.

Re:1440p? (0)

Anonymous Coward | more than 3 years ago | (#35221428)

Ah - having reviewed the article, I see that they are still producing 2560x1600 resolution, and are simply using that old capability as new bragging material. (NVidia's cards have been able to handle that resolution for years.) Perhaps the use of that as promotional material really is an indication that the monitor makers are finally going to let us get better screens. If the gamer audience _knows_ their video card is capable of that level of performance, they'll start demanding that the manufacturers give them hardware that can use it.

Did I mention how much I hate HDTV?

Re:1440p? (1)

Rennt (582550) | more than 3 years ago | (#35221450)

Well that's a suck-tastic downgrade from their current and past video card lines.

But a massive upgrade from current mobile SOCs. Honestly, these are designed for tablets... why the hell are you blathering about high res monitors and discrete chipsets?

Re:1440p? (1)

Salamander_Pete (1377479) | more than 3 years ago | (#35221524)

Well that's a suck-tastic downgrade from their current and past video card lines.

But a massive upgrade from current mobile SOCs. Honestly, these are designed for tablets... why the hell are you blathering about high res monitors and discrete chipsets?

Presumably they missed the words 'Tegra' and 'mobile' in the title, and 'Mobile World Congress' in the summary, plus most of TFA.

Re:1440p? (1)

wisty (1335733) | more than 3 years ago | (#35221898)

Yeah, but it means that my crappy netbook with Intel graphics can drive some of the best monitors on the market. No more monitor jealousy, as everyone is brought down to the same level...

except for 27-inch iMac users, who get 2560x1440.

Re:1440p? (1)

KronosReaver (932860) | more than 3 years ago | (#35222956)

The last monitor I bought for myself, right before HDTV came out and screwed us all, was 1920x1200 - you can't even find a monitor with that high of a resolution anymore.

Dell UltraSharp U2410 - 24" IPS screen @ 1920x1200 native resolution.

$600 MSRP

$500 or so in practice though

Re:1440p? (0)

Anonymous Coward | more than 3 years ago | (#35227610)

Or a Samsung 2443BW.

I am in Australia and bought one not long ago for about $340.

Re:1440p? (3, Informative)

beelsebob (529313) | more than 3 years ago | (#35220150)

Not true. 1440p, like 720p and 1080p, refers to the number of rows. 1440p would be 1920 pixels wide at 4:3 or 2560 pixels wide at 16:9.
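
To make the convention concrete, here's a minimal sketch (the helper is hypothetical, not from any post here) deriving the width from the row count and aspect ratio:

    # "p" names the row count; the width follows from the aspect ratio.
    def width_for(rows, aspect_w, aspect_h):
        return rows * aspect_w // aspect_h

    print(width_for(720, 16, 9))    # 1280 (720p)
    print(width_for(1080, 16, 9))   # 1920 (1080p)
    print(width_for(1440, 4, 3))    # 1920 (1440p at 4:3)
    print(width_for(1440, 16, 9))   # 2560 (1440p at 16:9)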

Re:1440p? (1)

chenjeru (916013) | more than 3 years ago | (#35220372)

You may be correct in this case, since it does seem that 1440p is a different format from what I was describing. However, my referenced format does indeed exist: it's the HDV 1080 standard, which is 1440x1080 with pixels at 1.33:1 (anamorphic).
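
The anamorphic arithmetic is simple; a quick sketch using the numbers from the post above:

    # HDV stores 1440x1080 but displays each pixel stretched 4:3 (1.33:1).
    storage_w, storage_h = 1440, 1080
    display_w = storage_w * 4 // 3  # apply the 1.33:1 pixel aspect ratio
    print(display_w, storage_h)     # 1920 1080 -> fills a 16:9 frame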

Re:1440p? (0)

dave420 (699308) | more than 3 years ago | (#35220968)

Yes, but that's not what we're talking about here. 1440p is not 1080p at 1.33:1.

slashdot making assumptions again (4, Informative)

OrangeTide (124937) | more than 3 years ago | (#35220654)

Their demonstration showed 2560x1440 content.

Re:slashdot making assumptions again (1)

chenjeru (916013) | more than 3 years ago | (#35221014)

I had assumed the lesser value of 1440 (HDV) versus the greater (1440p). Mark me as corrected, and pleasantly surprised.

forgivable (0)

Anonymous Coward | more than 3 years ago | (#35224648)

Well, it is not a mode available for HDMI, but it is a standard DVI mode. I believe they used a 30" 2560x1600 DVI monitor to display content (2560x1440 is 16:9; 1600 lines would have been 16:10). Because HDMI doesn't specify such a mode you won't find any TVs doing 1440p, and you can't do HDMI audio to a typical DVI monitor. But I believe the point of the demonstration is to show that the new chip is faster than the old chip rather than to demonstrate some technology that will show up on the market.

Re:1440p? (1)

kimvette (919543) | more than 3 years ago | (#35222364)

http://en.wikipedia.org/wiki/Extreme_High_Definition [wikipedia.org]

A related term is Extreme Definition (or XD). This is a term used on the Internet[citation needed], referring to the 1440p - 2560x1440 - resolution. The term was formulated with Extreme High Definition in mind, since both standards share the 2560 pixel horizontal resolution. To avoid confusion between the two resolutions, however, the word high was left out.

For several months, the only device which output this as its native resolution was Apple's 27-inch iMac. As of February 10, 2010 Dell has introduced a 27" 1440p display.

Re:1440p? (1)

RMingin (985478) | more than 3 years ago | (#35222740)

Incorrect. 1440p = 720p doubled in each dimension: 2560x1440. I believe you're thinking of 4:3 1080p, which was 1440x1080. When referring to HD resolutions by their single-number nicknames, it's always the vertical resolution that is named.

Re:1440p? (1)

dfghjk (711126) | more than 3 years ago | (#35223038)

This gets modded Insightful...

Re:1440p? (0)

Anonymous Coward | more than 3 years ago | (#35226014)

Why do people mod up stuff that's just plain incorrect?

Don't bother with the videos... (1)

sosaited (1925622) | more than 3 years ago | (#35220006)

You'll be just as disappointed as I was. It doesn't fly at all.

Re:Don't bother with the videos... (0)

Anonymous Coward | more than 3 years ago | (#35220174)

Did you try burying it? I hear it will form a giant crystalline fortress...

Kneel before who, now? (2)

Tumbleweed (3706) | more than 3 years ago | (#35220026)

So, performance similar to a Core 2 Duo (T7200) in a phone? Sa-weet! Gimme a dock I can plug my 'phone' into and use my monitor/mouse/keyboard/internet connection, and that's all the computer I'll need for most purposes. I'll fire up the big boy when I need to use Photoshop or other intensive things.

Re:Kneel before who, now? (1)

stms (1132653) | more than 3 years ago | (#35220188)

If the dock had a GPU on it, you wouldn't even need "the big boy."

Re:Kneel before who, now? (3, Interesting)

FatdogHaiku (978357) | more than 3 years ago | (#35220216)

Re:Kneel before who, now? (1)

Tumbleweed (3706) | more than 3 years ago | (#35220576)

No, not like that. That plugs the phone into a netbook-like enclosure. I'm talking about a dock that lets you use a real monitor, etc.

Re:Kneel before who, now? (1)

Guspaz (556486) | more than 3 years ago | (#35221188)

The Motorola Atrix also has a dock that does exactly that. The dock has a few USB ports for keyboards/mice, and HDMI out to plug in a desktop monitor.

Re:Kneel before who, now? (1)

stdarg (456557) | more than 3 years ago | (#35220288)

I think it's more for netbooks and stuff. They talk about a 1440p 10.1" screen. They compare it to the Atom and Core2Duo, which are not used in phones.

Re:Kneel before who, now? (1)

crhylove (205956) | more than 3 years ago | (#35225666)

Um, Photoshop runs fine on a Core 2 Duo. And why not use GIMP instead? It's free, does nearly everything Photoshop does, and is also much less resource-intensive. Adobe is like the Real of this decade. We're all going to be so glad when they're gone, like Real is now.

Geez, I'm made out of kryptonite (1)

Anonymous Coward | more than 3 years ago | (#35220100)

Geez, I'm made of kryptonite. This is unfair!

KAL-EL IS SUPERMAN !! IRAN NEEDS KAL-EL !! (-1)

Anonymous Coward | more than 3 years ago | (#35220108)

Kal-El is now known as Superman. Iran needs Superman, since they sorely can't do what their Arab neighbors can, lickety-split!! Bahrain and whatever bumbfuck places don't matter in the least.

power consumption? (4, Insightful)

crunzh (1082841) | more than 3 years ago | (#35220210)

They write nothing about power consumption... I am disappointed. The most important benchmark of a mobile CPU is power consumption; I can stick an Atom in a cellphone to get a lot of CPU power, but the batteries will be toast in no time.

Re:power consumption? (1)

Allicorn (175921) | more than 3 years ago | (#35220224)

And since they're not crowing about how great it is (which they certainly would be if it were), we can probably fairly safely assume it's a relative battery-buster.

Re:power consumption? (1)

gabebear (251933) | more than 3 years ago | (#35220314)

It looks like these aren't due to be released till the end of the year (with devices using them in early 2012). It's possible they simply have no good power consumption numbers yet.

Re:power consumption? (2)

Khyber (864651) | more than 3 years ago | (#35221018)

"It's possible they simply have no good power consumption numbers yet."

Sorry, even my company has the brains to hook the equipment up to a Kill A Watt during the various testing phases of product development, so we have power figures available immediately for given loads.

If nVidia can't cough up $200 in measly American currency for ONE KAW tester, then nVidia is bound to be going the way of the dinosaur.

Oh, wait, they've already begun emulating 3dfx by selling their own cards. We all saw how well that worked for Elpin systems.

Re:power consumption? (2)

Guspaz (556486) | more than 3 years ago | (#35221210)

They've had actual silicon for 12 days. They may have been too busy showing it off to the press to sit down and plug it into a Kill A Watt.

Referencing the AnandTech article, nVidia claims that for the same workload it is as efficient or more efficient than the Tegra 2, but if you increase the workload, it'll obviously use a bunch more power.

Re:power consumption? (3, Insightful)

CAIMLAS (41445) | more than 3 years ago | (#35221808)

Incorrect. The latest Atoms (granted, not readily available for consumption) are fast, and lower-power than some of the leading ARM smartphone CPUs/SoCs (or at least comparable on a perf/watt basis).

Your biggest power drains in a smartphone will be:

* Cellular and WiFi radios
* Display
* Crap software - poorly implemented drivers for the above, in addition to poorly implemented 3D/etc. drawing mechanisms which ineffectively utilize the processor, draw a lot on the screen, and so on.

Re:power consumption? (2)

MukiMuki (692124) | more than 3 years ago | (#35223650)

Ummm, no? 9 watts is low for an Atom chip. 5 watts is unheard of, though AMD is planning on something to that effect with Bobcat.

The dual-core 1GHz Tegra 2, with its embedded graphics core and 720p H.264 video decode (actually 1080p, but 720p support is a LOT more comprehensive), is 2 watts. TWO. And that's from six months ago, when Nvidia's design was over power budget. It might be closer to 1 or 1.5 now. That's for the ENTIRE chipset, whereas Atom's motherboard adds another 10-20 watts.

Even the AMD chip, which has the same video decode and level of embedded graphics, probably won't hold up all that well in the 5-watt range, and that's 2-3 times the power consumption.

Re:power consumption? (2)

steveha (103154) | more than 3 years ago | (#35224468)

Mod parent up. A Tegra 2 is a "system on a chip" and you don't need much else. An Atom needs support chips, and you have to look at the total power budget of the Atom plus support chips.

A Tegra is much more power-efficient than an Atom. It is not an accident that Android 3.0 tablets will be running on Tegra 2 chips, and not on Atom chips.

steveha

Re:power consumption? (1)

CAIMLAS (41445) | more than 3 years ago | (#35229022)

You, and your parent, need to do a little research.

An Atom CPU is a bit more than just those 330 whatevers you can still find. There are literally dozens of variants of the "Atom" chips now.

There is indeed an Atom SoC that clocks in under 4 watts for TDP. This says nothing about idle states, which are drastically improved.

Now consider the fact that the LCD on a phone takes probably in the range of 15 watts, maybe a bit more. Then you've got the radio, which is going to reduce your battery life all the further and is comparable to the display (maybe 2/3 the TDP). WiFi? Tack it on again.

No, the CPU is not the major component in power sucking. It's negligible compared to the other components, particularly with modern designs. The latest Atom SoCs are almost as efficient as the Snapdragon and other 'latest generation' mobile phone CPUs. The Tegra 2 is interesting, but it's hardly a game-changer: it's still "just an ARM CPU". Since it doesn't appear to offer radio capabilities, it offers nothing over the Snapdragon or other ARM cell phone processors - its bet is in consoles or handheld game systems. Now, an x86 cell phone, on the other hand... that has a bit more interesting potential.

Re:power consumption? (1)

crhylove (205956) | more than 3 years ago | (#35225684)

Mod up. All the drain on most Android phones I've used/seen/fixed/rooted/CyanogenModded was from the display.

That's because... (1)

denzacar (181829) | more than 3 years ago | (#35222438)

It will be harnessing the power of our yellow Sun, which will give it super-speed, super-strength, flight, x-ray vision, invulnerability and various other super-abilities and powers.

So really, you don't have to worry about power consumption. But you DO have to worry about kryptonite exposure and Lex Luthor.

Re:power consumption? (1)

Misagon (1135) | more than 3 years ago | (#35225130)

Yes, but the power consumption of the CPU is still dwarfed by the power consumption of sending and receiving radio signals, so nobody will care.

Nvidia, masters of exaggeration. 2x cores= 5X perf (1)

guidryp (702488) | more than 3 years ago | (#35220266)

They have essentially doubled the core count, from 2 cores to 4. They are essentially the same cores.

Are they running at 2.5x the clock speed as well? I seriously doubt it.

Really, this goes beyond exaggeration; it's more like pure false advertising.

Re:Nvidia, masters of exaggeration. 2x cores= 5X p (1)

bhtooefr (649901) | more than 3 years ago | (#35220354)

The math IS screwy - 2 times better CPU performance (the benchmarks THEY show even show this - it's the same clock speed, same CPU cores), 3 times better GPU performance.

You don't get to add those numbers.

Still, being able to dance with a Core 2 Duo is pretty damn good.
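
To see why the numbers don't add, consider a toy model (the 50/50 split and the function are illustrative assumptions, not NVIDIA's data): the overall speedup depends on how the workload divides between CPU and GPU work.

    # Toy model: cpu_frac of runtime is CPU work, the rest is GPU work.
    def combined_speedup(cpu_frac, cpu_speedup, gpu_speedup):
        new_time = cpu_frac / cpu_speedup + (1.0 - cpu_frac) / gpu_speedup
        return 1.0 / new_time

    # Even a 50/50 split with 2x CPU and 3x GPU yields only:
    print(round(combined_speedup(0.5, 2.0, 3.0), 2))  # 2.4 -- nowhere near 5x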

not really an exaggeration (0)

Anonymous Coward | more than 3 years ago | (#35220748)

Double CPUs, double GPUs, and double memory bandwidth. Yeah, it really is about twice as fast.

Re:not really an exaggeration (1)

Daniel Phillips (238627) | more than 3 years ago | (#35221420)

If you have two CPUs, both going faster because of the increased memory bandwidth, average aggregate throughput will more than double because not every single instruction references memory and memory bandwidth is not continuously maxed out.
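
A toy model of that point (every fraction and penalty here is a made-up assumption, not a measurement): if extra bandwidth relieves contention on the shared bus, doubling the cores can more than double aggregate throughput.

    # Per-core throughput: compute cycles run at full speed; the memory
    # fraction of cycles gets stretched by contention on the shared bus.
    def throughput(cores, mem_fraction, bus_penalty):
        per_core = 1.0 / ((1.0 - mem_fraction) + mem_fraction * bus_penalty)
        return cores * per_core

    old = throughput(2, 0.3, bus_penalty=2.0)  # two cores, bus often maxed out
    new = throughput(4, 0.3, bus_penalty=1.2)  # four cores, doubled bandwidth
    print(round(new / old, 2))  # ~2.45 -- more than double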

Re:Nvidia, masters of exaggeration. 2x cores= 5X p (0)

Anonymous Coward | more than 3 years ago | (#35225610)

Still, being able to dance with a Core 2 Duo is pretty damn good.

But can it dance with the devil in the pale moonlight?

Last Son of Krypton (0)

LastGunslinger (1976776) | more than 3 years ago | (#35220334)

According to Nvidia's press release, the chip is more powerful than a locomotive and able to leap buildings in a single bound. Powered by Earth's yellow sun, its only weakness is kryptonite.

Name (1, Funny)

T.E.D. (34228) | more than 3 years ago | (#35220360)

...named of course after terrorist mastermind Kalel Sheikh Mohammed.

Re:Name (1)

CAIMLAS (41445) | more than 3 years ago | (#35221818)

Are you kidding, or do you have something to back that up? It's as plausible as anything, I suppose, but I'm interested in hearing reasoning if indeed this is true.

Re:Name (1)

theMAGE (51991) | more than 3 years ago | (#35222402)

He's just being silly - the guy's name is actually Khalid, not Khalel.

Re:Name (0)

Anonymous Coward | more than 3 years ago | (#35222512)

They codename based on superheroes; Kal-El is Superman's Kryptonian name.

Re:Name (1)

T.E.D. (34228) | more than 3 years ago | (#35224068)

Still, they ought to name them after terrorists instead. DC Comics is quite likely to sue them for trademark infringement. If terrorists do the same, NVidia can happily offer to meet them in court... and then forward the summons with the court date to the FBI. We could finally capture Bin Laden!

Re:Name (0)

Anonymous Coward | more than 3 years ago | (#35223620)

Are you fucking joking? The OP of this thread was clearly joking. Are you just trolling? Or culturally illiterate?

Kal-El is the Kryptonian name of Clark Kent aka Superman.

Re:Name (0)

Anonymous Coward | more than 3 years ago | (#35225410)

Kal-El is Superman's Kryptonian name. I would assume that the gp knows this.

Re:Name (0)

Anonymous Coward | more than 3 years ago | (#35229612)

Superman's birth name was Kal-El. The upcoming Nvidia products all use superheroes' real (for lack of a better word) names: Logan for Wolverine, Stark for Iron Man, etc...

http://en.wikipedia.org/wiki/Superman

Re:Name (1)

QuaveringGrape (1573239) | more than 3 years ago | (#35222682)

Named after Superman's Kryptonian name. [wikipedia.org]

Kids these days ain't got no culture.

Re:Name (0)

Anonymous Coward | more than 3 years ago | (#35224156)

You lose geek points. Kal-El is Superman's Kryptonian name. It's also similar to Hebrew for "Voice of God".

"offer 300 DPI on devices with a 10.1'' display" (0)

Anonymous Coward | more than 3 years ago | (#35220476)

What is this bullshit? Can you just give us the maximum framebuffer dimension and refresh rate? Why do people STILL think you need more "graphics power" and more video memory to display 1024*768 on a 27" screen than 1024*768 on a 21" screen? What the hell...

Re:"offer 300 DPI on devices with a 10.1'' display (1)

blackraven14250 (902843) | more than 3 years ago | (#35220684)

The size of the screen is used in conjunction with the DPI, not the raw resolution.

Re:"offer 300 DPI on devices with a 10.1'' display (0)

Khyber (864651) | more than 3 years ago | (#35221032)

Mentioning DPI is a useless measurement. This is how we can tell nVidia is clawing desperately at nothing trying to market something that isn't worth half a shit anyways.

When companies start going by absolutely useless metrics (DPI is PURELY dependent upon the screen maker), you know they've hit a hard spot and are doing ANYTHING to make money - INCLUDING LYING.

And nVidia is definitely lying/intentionally misleading customers by trying to claim there's enough power to push a certain resolution @ 300DPI.

Looks like my next card purchase will be Matrox. I'm done gaming anyways; nothing has been innovative since Portal.

Re:"offer 300 DPI on devices with a 10.1'' display (1)

0123456 (636235) | more than 3 years ago | (#35221110)

Mentioning DPI is a useless measurement. This is how we can tell nVidia is clawing desperately at nothing trying to market something that isn't worth half a shit anyways.

Uh-huh.

As I understand it, this chip is aimed at netbooks, which typically have 10" displays with too few pixels to display much (e.g. mine is 1024x600, which is too small to be really usable for many GUI apps). So DPI on a 10" display is a useful number in determining the screen resolution those computers would be able to support.

Though, of course, saying 1440p is more useful.

Re:"offer 300 DPI on devices with a 10.1'' display (1)

Khyber (864651) | more than 3 years ago | (#35221124)

Well, saying 1440p doesn't say much, as you're missing the aspect ratio.

Re:"offer 300 DPI on devices with a 10.1'' display (1)

blackraven14250 (902843) | more than 3 years ago | (#35221418)

No, they're saying 300 DPI on a 10.1" display, not 300 DPI @ 1440p. It's literally the title of every post in this thread, and you failed to see it.

It's also not a useless metric, since Apple decided to advertise their brand-new 300 DPI screen for the iPhone 4. They're making a comparison against a clarity benchmark already set by another company. They're just saying their chip can do it on a 10.1" screen.

Re:"offer 300 DPI on devices with a 10.1'' display (1)

Desler (1608317) | more than 3 years ago | (#35221488)

And nVidia is definitely lying/intentionally misleading customers by trying to claim there's enough power to push a certain resolution @ 300DPI.

lolwut? The quoted section says:

and offer 300 DPI on devices with a 10.1" display.

Where did they say anything about resolution @ 300DPI? It says 300DPI on a 10.1" display. The 1440p part was a separate clause joined by the conjunction "and". Reading comprehension ftw!

Re:"offer 300 DPI on devices with a 10.1'' display (1)

shutdown -p now (807394) | more than 3 years ago | (#35223276)

Mentioning DPI is a useless measurement.

Have you missed all the "retina display" hype?

Anyway, given DPI and screen size, you can easily get the dimensions in pixels.

Re:"offer 300 DPI on devices with a 10.1'' display (1)

PitaBred (632671) | more than 3 years ago | (#35221016)

If you use 300DPI in conjunction with a 10.1" display (assuming 16:9 ratio), you get ~2560x1440 resolution. The math isn't hard.
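
That math, as a quick sketch (the helper is ours, and it assumes an exactly 16:9 panel):

    import math

    # Pixel dimensions from diagonal size, pixel density, and aspect ratio.
    def pixel_dims(diagonal_in, dpi, aspect_w=16, aspect_h=9):
        diag_units = math.hypot(aspect_w, aspect_h)
        width = diagonal_in * aspect_w / diag_units * dpi
        height = diagonal_in * aspect_h / diag_units * dpi
        return round(width), round(height)

    print(pixel_dims(10.1, 300))  # (2641, 1485) -- i.e. roughly 2560x1440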

Re:"offer 300 DPI on devices with a 10.1'' display (0)

Anonymous Coward | more than 3 years ago | (#35221482)

Now THAT would be the screen for the e-book reader that I want to buy - finally they are catching up with laser printers from 1985! No longer would you need to do awkward zooming to make out subscripts, superscripts, other small print, or diagrams... all of which are a necessity for reading scientific articles.

Huh? (0)

Anonymous Coward | more than 3 years ago | (#35220618)

Why'd they codename the chipset after Nicolas Cage's kid? ...

ATI Kryptonite! (1)

DarthVain (724186) | more than 3 years ago | (#35220644)

I can't wait for AMD/ATI to come out with their new GPU code named Kryptonite!

Re:ATI Kryptonite! (0)

Anonymous Coward | more than 3 years ago | (#35221252)

Kal-el doesn't stand a chance against Zod.

Re:ATI Kryptonite! (1)

Tijok (1637233) | more than 3 years ago | (#35221948)

Well you are in luck! It already happened, 15 years ago!

"AMD's first in-house x86 processor was the K5 which was launched in 1996.[7] The "K" was a reference to Kryptonite..."

I kid, I kid.

Source: http://en.wikipedia.org/wiki/Advanced_Micro_Devices#Processor_market_history [wikipedia.org]

high profile (0)

Anonymous Coward | more than 3 years ago | (#35220646)

The real question is: can it decode 1080p High Profile H.264, or did they shit the bed again?

Next up: Kal-L (0)

Anonymous Coward | more than 3 years ago | (#35220972)

After that: Crisis

Solar Powered (0)

Anonymous Coward | more than 3 years ago | (#35221106)

I wonder when the requirement for a solar array on the roof comes out. Also, make sure you don't have any green glowing rocks around; you might end up with a BSoD or a segmentation fault somewhere.

This ARM right? (1)

jabjoe (1042100) | more than 3 years ago | (#35221532)

Quad-core ARM Cortex of some kind? Guessing more than an A9? Where are the real details about this chip?

Re:This ARM right? (2)

alvinrod (889928) | more than 3 years ago | (#35222328)

If it's anything like the Tegra 2, it's going to be regular Cortex-A9 cores, an Nvidia GPU, and the usual dedicated hardware found on most ARM SoCs. Here's a picture of the Tegra 2 [anandtech.com], so I imagine that the Tegra 3 will look similar, just with more cores and a beefier GPU.

However, the Tegra 2 doesn't perform any better than the Exynos from Samsung or TI's newest OMAP based on AnandTech benchmarks [anandtech.com], so I don't expect Tegra 3 to be much different from other parts available at the time. Considering Sony has said their next PSP, which is targeted to ship around the holidays, is going to have four Cortex-A9 cores and four SGX543 graphics cores, the Tegra 3 probably won't be a runaway performance monster.

What I'm most excited for are the ARM Cortex-A15 products that should be out next year. Those will allow for much higher clock rates and should make for great netbook performance. Keep the usual SoC dedicated hardware components and battery life will be even more phenomenal.

Waiting for the C&D... (1)

robnator (250608) | more than 3 years ago | (#35222286)

If DC doesn't have IP there... well, I guess they'd have gone after Cage already if they did.

Maybe Windows will eventually work, then (1)

caywen (942955) | more than 3 years ago | (#35223882)

Amazing how fast the industry is ramping up the hardware capabilities of mobile devices. Responsiveness, among the critical problems Windows has on tablets, might end up being solved for them. Of course, battery life and usability are still huge problems.

Super chips (1)

steveha (103154) | more than 3 years ago | (#35224628)

From the slide:

2011 Kal-el
2012 Wayne
2013 Logan
2014 Stark

That's Superman, Batman, Wolverine, and Iron Man.

There is a thread here claiming The Tick is in the list, but if so, he's not in the slide from TFA, he's not in the Wikipedia article [wikipedia.org], and Google search doesn't know about it. It's a joke or a troll.

According to the graph, the performance to come is just crazy! Performance compared to the Tegra 2:

Kal-El: 5x
Wayne: 10x
Logan: 50x
Stark: 75x? 80x?

I'm not sure how those numbers can be real, though. A Tegra 2 is already a substantial fraction of the performance of a desktop processor. If we can have 75x that performance in a few years, are they promising to outperform desktop processors in just a few years' time? Or do they think desktop processors will stop plateauing and start ramping up performance dramatically again?

Now more than ever, I want a slim netbook with a full-size keyboard, a Tegra processor, and a Pixel Qi screen. It would be great for email and web, for taking notes, etc. It would be just the thing for carrying around at a conference or convention.

steveha

Re:Super chips (0)

Anonymous Coward | more than 3 years ago | (#35224956)

I think they're counting the GPU component too. Maybe some dodgy math is involved, but I could see the 2014 ARM CPU giving 10x or better performance than the Tegra 2 and the 2014 GPU side being like 100x, because embedded GPUs are much further behind desktop GPUs.

I don't know if they can do quite that much for mobile stuff (keeping it at the current 2W-and-below), though. Maybe the Logan and Stark stats are for desktop-class variants, and the embedded one will have fewer cores and a less beefy GPU.

Re:Super chips (0)

Anonymous Coward | more than 3 years ago | (#35229652)

Well, maybe they're relying on higher core count as a multiplier to performance. That's true for applications that utilize all the cores, but in the real world, when you upgrade from quad-core to octo-core you just have six cores sitting there doing nothing most of the time instead of two. I did find the jump to dual-core was amazing, but after that I haven't noticed much from upping the cores.

of course, the benchmarks are stacked (0)

Anonymous Coward | more than 3 years ago | (#35227102)

http://images.anandtech.com/reviews/SoC/NVIDIA/Kal-El/DSC_1401.jpg

Clear differences in gcc versions (3.4.4 vs. 4.4.1): -O2 on the C2D, -O3 with -funroll-loops on the Tegra.
